Oh my god. It's full of code!

Salesforce

Apex list all fields on page layout

Hey everyone, I know it’s been a while, but I am in fact still alive. Anyway, I’ve got something new for ya. I’ve been asked to identify fields that are actually being used by evaluating page layouts. Fields that are in use will need to have their values exported from a legacy SF instance and imported into a new one. So instead of manually going through every layout, logging each field, and finding its data type (which would be crazy slow and error prone), I wrote a nifty little script. You simply feed it page layout names and it will get all the fields on them, describe them, create suggested mappings and transformations, then email you the results with each layout as a separate CSV attachment so you can use it as a starting point for an Excel mapping document. It can also describe the picklist values for every field on any object described. Hopefully your sandbox is the same as your production so you can just save this in there and run it without having to deploy to prod. Remember to turn on email deliverability when running from a sandbox! This is still pretty new (as in this is my first time using it, immediately after building it), so if you find errors in its output or have any suggestions I’m definitely open to hearing about them in the comments.

UPDATE: After adding some more features it became too large to be an execute anonymous script, so it’s now been converted to a class. Save this into a new Apex class, then from execute anonymous call LayoutDescriber.sendLayoutInfo() to run with default settings, or pass in a list of page layout names and a flag for whether or not to get picklist values. If you want to run it as a script you can remove the picklist value builder (lines 156-210) and the check for valid page layout names (lines 26-41). That should get it small enough to run.

public class LayoutDescriber
{
    /**
    *@Description Gets all the fields for the provided page layouts and emails the current user a CSV document for each. It
                  also gets related field data and provides suggested mapping configuration for import. Optionally can get picklist values for objects.
    *@Param pageLayoutNames a list of page layout names. Format is [objectName]-[namespace]__[Page layout name].
            Omit the namespace and underscores if the layout is not part of a managed package.
            EX: Account-SkienceFinSln__Address
            OR
            EX: Account-Account Layout
    *@Param getPicklistValues flag that controls whether picklist values for described objects should be included.
    **/
    public static void sendLayoutInfo(list<string> pageLayoutNames, boolean getPicklistValues)
    { 
        List<Metadata.Metadata> layouts = Metadata.Operations.retrieve(Metadata.MetadataType.Layout, pageLayoutNames);
        
        for(string layOutName : pageLayoutNames)
        {
            boolean layoutFound = false;
            for(integer i = 0; i < layouts.size(); i++)
            {
                Metadata.Layout layoutMd = (Metadata.Layout) layouts.get(i);
                if(layoutMd.fullName == layOutName)
                {
                    layoutFound = true;
                }
            }
            if(layoutFound == false)
            {
                throw new applicationException('No layout with name ' + layOutName + ' could be found. Please check and make sure the namespace is included if needed.');
            }
        }
        map<string,map<string,list<string>>> objectPicklistValuesMap = new map<string,map<string,list<string>>>();
        
        map<string,list<string>> objectFieldsMap = new map<string,list<string>>();
        
        for(integer i = 0; i < layouts.size(); i++)
        {
            Metadata.Layout layoutMd = (Metadata.Layout) layouts.get(i);
        
            list<string> objectFields = new list<string>();
            
            for (Metadata.LayoutSection section : layoutMd.layoutSections) 
            {        
                for (Metadata.LayoutColumn column : section.layoutColumns) 
                {
                    if (column.layoutItems != null) 
                    {
                        for (Metadata.LayoutItem item : column.layoutItems) 
                        {
                            if(item.field == null) continue;
                            objectFields.add(item.field);
                        }
                    }
                }
            }
            objectFields.sort();
            objectFieldsMap.put(pageLayoutNames[i].split('-')[0],objectFields);
        }
        
        system.debug(objectFieldsMap);
        
        Map<String, Schema.SObjectType> globalDescribe = Schema.getGlobalDescribe();
        
        Map<String, Map<String, Schema.SObjectField>> objectDescribeCache = new Map<String, Map<String, Schema.SObjectField>>();
        
        String userName = UserInfo.getUserName();
        User activeUser = [Select Email From User where Username = : userName limit 1];
        String userEmail = activeUser.Email;
        
        Messaging.SingleEmailMessage message = new Messaging.SingleEmailMessage();
        message.toAddresses = new String[] { userEmail };
        message.subject = 'Describe of fields on page layouts';
        message.plainTextBody = 'Save the attachments and open in Excel. Field names and types should be properly formatted.';
        Messaging.SingleEmailMessage[] messages =   new List<Messaging.SingleEmailMessage> {message};
        list<Messaging.EmailFileAttachment> attachments = new list<Messaging.EmailFileAttachment>();
        
        integer counter = 0;    
        for(string thisObjectType : objectFieldsMap.keySet())
        {
            list<string> fields = objectFieldsMap.get(thisObjectType);
            
            Map<String, Schema.SObjectField> objectDescribeData;
            if(objectDescribeCache.containsKey(thisObjectType))
            {
                objectDescribeData = objectDescribeCache.get(thisObjectType);
            }
            else
            {
                objectDescribeData = globalDescribe.get(thisObjectType).getDescribe().fields.getMap();
                objectDescribeCache.put(thisObjectType,objectDescribeData);
            }
        
        
            string valueString = 'Source Field Name, Source Field Label, Source Field Type, Source Required, Source Size, Is Custom, Controlling Field, Target Field Name, Target Field Type, Target Required, Transformation \r\n';
            for(string thisField : fields)
            {
                if(thisField == null || !objectDescribeData.containsKey(thisField))
                {
                    system.debug('\n\n\n--- Missing field! ' + thisField);
                    if(thisField != null) valueString+= thisField + ', Field Data Not Found \r\n';
                    continue;
                }
                
                Schema.DescribeFieldResult dfr = objectDescribeData.get(thisField).getDescribe();
                
                if( (dfr.getType() == Schema.DisplayType.picklist || dfr.getType() == Schema.DisplayType.MultiPicklist) && getPicklistValues)
                {
                    List<String> pickListValuesList= new List<String>();
                    List<Schema.PicklistEntry> ple = dfr.getPicklistValues();
                    for( Schema.PicklistEntry pickListVal : ple)
                    {
                        pickListValuesList.add(pickListVal.getLabel());
                    }     
        
                    map<string,list<string>> objectFields = objectPicklistValuesMap.containsKey(thisObjectType) ? objectPicklistValuesMap.get(thisObjectType) : new map<string,list<string>>();
                    objectFields.put(thisField,pickListValuesList);
                    objectPicklistValuesMap.put(thisObjectType,objectFields);
                }
                boolean isRequired = !dfr.isNillable() && dfr.getType() != Schema.DisplayType.Boolean;
                string targetFieldName = dfr.isCustom() ? '' : thisField;
                string targetFieldType = dfr.isCustom() ? '' : dfr.getType().Name();
                string defaultTransform = '';
                
                if(dfr.getType() == Schema.DisplayType.Reference)
                {
                    defaultTransform = 'Update with Id of related: ';
                    for(Schema.sObjectType thisType : dfr.getReferenceTo())
                    {
                        defaultTransform+= string.valueOf(thisType) + '/';
                    }
                    //removeEnd returns a new string rather than modifying in place, so assign the result back
                    defaultTransform = defaultTransform.removeEnd('/');
                }    
                if(thisField == 'LastModifiedById') defaultTransform = 'Do not import';
                valueString+= thisField +',' + dfr.getLabel() + ',' +  dfr.getType() + ',' + isRequired + ',' +dfr.getLength()+ ',' +dfr.isCustom()+ ',' +dfr.getController() + ','+ 
                              targetFieldName + ',' + targetFieldType +',' + isRequired + ',' + defaultTransform +'\r\n';
            }
        
            Messaging.EmailFileAttachment efa = new Messaging.EmailFileAttachment();
            efa.setFileName(pageLayoutNames[counter]+'.csv');
            efa.setBody(Blob.valueOf(valueString));
            attachments.add(efa);
            
            counter++;
        }
        //if we are getting picklist values we will now build a document for each object. One column per picklist, with its rows being the values of the picklist
        if(getPicklistValues)
        {
            //loop over the object types
            for(string objectType : objectPicklistValuesMap.keySet())
            {
                //get all picklist fields for this object
                map<string,list<string>> objectFields = objectPicklistValuesMap.get(objectType);
                
                //each row of data will be stored as a string element in this list
                list<string> dataLines = new list<string>();
                integer rowIndex = 0;
                
                //string to contains the header row (field names)
                string headerString = '';
                
                //due to how the data is structured (column by column) but needs to be built (row by row) we need to find the column with the maximum amount of values
                //so our other columns can insert a correct number of empty space placeholders if they don't have values for that row.
                integer numRows = 0;
                for(string fieldName : objectFields.keySet())
                {
                    if(objectFields.get(fieldName).size() > numRows) numRows = objectFields.get(fieldName).size();
                }
                
                //loop over every field now. This is going to get tricky because the data is structured as a field with all its values contained but we need to build
                //our spreadsheet row by row. So we will loop over the values and create one entry in the dataLines list for each value. Each additional field will then add to the string
                //as required. Once we have constructed all the rows of data we can append them together into one big text blob and that will be our CSV file.
                for(string fieldName : objectFields.keySet())
                {
                    headerString += fieldName +',';
                    rowIndex = 0;
                    list<string> picklistVals = objectFields.get(fieldName);
                    for(integer i = 0; i<numRows; i++ )
                    {
                        string thisVal = i >= picklistVals.size() ? ' ' : picklistVals[i]; 
                        if(dataLines.size() <= rowIndex) dataLines.add('');
                        dataLines[rowIndex] += thisVal + ', ';
                        rowIndex++;        
                    }
                }
                headerString += '\r\n';
                
                //now that our rows are constructed, add newline chars to the end of each
                string valueString = headerString;
                for(string thisRow : dataLines)
                {            
                    thisRow += '\r\n';
                    valueString += thisRow;
                }
                
                Messaging.EmailFileAttachment efa = new Messaging.EmailFileAttachment();
                efa.setFileName('Picklist values for ' + objectType +'.csv');
                efa.setBody(Blob.valueOf(valueString));
                attachments.add(efa);        
            }
        }
        
        
        message.setFileAttachments( attachments );
        
        Messaging.SendEmailResult[] results = Messaging.sendEmail(messages);
         
        if (results[0].success) 
        {
            System.debug('The email was sent successfully.');
        } 
        else 
        {
            System.debug('The email failed to send: ' + results[0].errors[0].message);
        }
    }
    public class applicationException extends Exception {}
    
    public static void sendLayoutInfo()
    {
        list<string> pageLayoutNames = new List<String>();
        pageLayoutNames.add('Account-Account Layout');
        pageLayoutNames.add('Contact-Contact Layout');
        pageLayoutNames.add('Opportunity-Opportunity Layout');
        pageLayoutNames.add('Lead-Lead Layout');
        pageLayoutNames.add('Task-Task Layout');
        pageLayoutNames.add('Event-Event Layout');
        pageLayoutNames.add('Campaign-Campaign Layout');
        pageLayoutNames.add('CampaignMember-Campaign Member Page Layout');
        sendLayoutInfo(pageLayoutNames, true);
    }
}
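The picklist section above has to pivot column-oriented data (one list of values per field) into row-oriented CSV lines, padding shorter columns with blanks. Here is the same pivot sketched in plain JavaScript — a minimal sketch of the technique, not part of the class itself, and the field names in the example are hypothetical:

```javascript
// Pivot a map of field name -> picklist values (column-oriented) into CSV
// text (row-oriented): one column per field, shorter columns padded blank.
function picklistMapToCsv(objectFields) {
    var fieldNames = Object.keys(objectFields);
    // find the longest column so shorter ones can be padded
    var numRows = 0;
    fieldNames.forEach(function (f) {
        numRows = Math.max(numRows, objectFields[f].length);
    });
    var lines = [fieldNames.join(',')]; // header row of field names
    for (var i = 0; i < numRows; i++) {
        lines.push(fieldNames.map(function (f) {
            return i < objectFields[f].length ? objectFields[f][i] : '';
        }).join(','));
    }
    return lines.join('\r\n');
}
```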


The result is an email with a bunch of attachments: one for each page layout and one for each object’s picklist fields (if enabled).

mmmmm attachments

For example this is what is produced for the lead object.

Nicely formatted table of lead fields and suggested mappings.


 

And here is what it built for the picklist values

Sweet sweet picklist values. God I love properly formatted data.

Anyway, I hope this might help some of y’all out there who have been given the painful task of finding which fields are actually being used on page layouts. Till next time.


Salesforce development is broken (and so am I)

Before I begin this is mostly a humor and venting post. Don’t take it too seriously.

So I’m doing development on a package that needs to work for both person accounts and regular accounts. Scratch orgs didn’t exist when this project was started, so we originally had a developer org, then a packaging org which contained the namespace for the package. (This ended up being a terrible idea, because all kinds of weird bugs start to show up when you do your dev without a namespace and then try to add one. Any dynamic code pretty much breaks, and you have to remove the namespace from any data returned by Apex controllers that provide data to field inputs in Lightning; field set names, object names, etc. all get messed up.)

Still, after adding some workarounds we got that working. However, since the developer org doesn’t have person accounts, we need another org that does so we can add in the extra bits of logic where needed. We wanted to keep the original dev org without person accounts, as it’s sort of an auxiliary feature and we didn’t want it causing any problems with the core package.

Development of the core package goes on for about a year. Now it’s time to tackle adding the extra logic for person accounts, which are themselves awful. I don’t know who thought it was a good idea to basically have two different schemas, with the second being a half-broken, poorly defined bastardization of the original good version. Seriously, they are sometimes account-like, sometimes contact-like; the account has the contact fields, yet a separate contact object kind of exists but you cannot get to it without directly entering the Id in the URL. The whole thing barely makes any sense. Interacting with them from Apex is an absolute nightmare. In this case account and contact data are integrated with a separate system, which also has concepts of accounts and contacts. So normally we create an account, then tie contacts to it. In the case of person accounts we have to create some kind of weird hybrid of the data, creating both an account and a contact from one object, but not all the data is directly on the account. For example, we need to get the mailing address off the contact portion, along with a few other custom fields that the package adds. So we have to smash the two objects together and send it. It’s just bizarre. Anyway, at this point scratch orgs exist, but we cannot create one from our developer org for some reason; the dev hub option just doesn’t exist. The help page says dev hub/scratch orgs are available in developer orgs, but apparently not in this specific one, for no discernible reason.

We cannot enable them in our packaging org either, as you cannot enable dev hub from an org with a namespace. So my coworker instead enables dev hub from his own personal dev org and creates me a scratch org, into which I install the unmanaged version of the package to easily get all the code and such. Then I just manually roll my changes from that org into dev, and from dev into packaging. That works fine until the scratch org expires, which apparently it just did. Now I cannot log into it, and my dev is suddenly halted. There were no warning emails received (maybe he got them but didn’t tell me) and no way to re-enable the org. It’s just not accessible anymore. Thank goodness I have local copies of my code (we haven’t really gotten version control integrated into our workflow yet) or else I’d have lost all that work.

I now have to set out to get a new org set up (when I’m already late for a deadline on some fixes). Fine, so I attempt to create a scratch org from my own personal dev org (which itself is halfway broken: it still has the theme from before ‘classic’, and enabling Lightning gives me a weird hybrid version which looks utterly ridiculous).

I enable dev hub and set out to create my scratch org from VS Code (I’ve never done this, so I’m following a tutorial). I create my project, authorize my org, then lo and behold, an error occurs while trying to create my scratch org: “ERROR running force:org:create: Must pass a username and/or OAuth options when creating an AuthInfo instance.” I can’t find any information on how to fix this; I tried recreating the project and reauthorizing, and still nothing. Not wanting to waste any more time, I say fine, I’ll just create a regular old developer org, install the unmanaged package, and enable person accounts.

I create my new dev org (after some mild annoyance at not being able to end my username with a number) and get it linked to my IDE. So now I need to enable person accounts, but wait, you cannot do that yourself. You have to contact support to enable it, and guess what: Salesforce no longer allows you to create cases from a developer org. Because this package is being developed as an ISV-type package, I don’t have a prod org to log in to and create a case from. So now I’m mostly stuck. I’ve asked a co-worker who has access to a production org to log a case, giving them my org Id; I’m hoping support will be willing to accept a feature request for an org other than the one the case is coming from. Otherwise I don’t know what I’ll do.


I’m sure once things mature more it’ll get better, and a good chunk of these problems are probably my own fault somehow but still, this is nuts.



Salesforce Lightning DataTable Query Flattener

So I was doing some playing around with the Salesforce lightning:datatable component, and while it does make displaying query data very easy, it isn’t super robust when it comes to handling parent and child records. Just to make life easier in the future, I thought it might be nice to write a function which could take a query result returned by a controller and ‘flatten’ it so that all the data was available to the data table, since it cannot access nested arrays or objects. Of course, the table itself doesn’t have a way to iterate over nested rows, so the child array flattening is not quite as useful (unless, say, you wanted to show a contact’s most recent case or something). Anyway, hopefully this will save you some time writing wrapper classes, or from having to skip using the data table when you have parent or child nested data.
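The core flattening idea is independent of Aura: walk each record, and whenever a value is itself an object (or an array, whose indices enumerate like keys), fold the parent property name into the key with an underscore, so Owner.Name becomes Owner_Name and Cases[0].Subject becomes Cases_0_Subject. A simplified, framework-free sketch (flattenRecord is a hypothetical name, not the helper used below):

```javascript
// Flatten a nested record into a single level of underscore-joined keys.
// prefix carries the chain of parent property names down the recursion.
function flattenRecord(record, prefix) {
    var flat = {};
    Object.keys(record).forEach(function (key) {
        var value = record[key];
        var flatKey = prefix ? prefix + '_' + key : key;
        if (value !== null && typeof value === 'object') {
            // objects and arrays both recurse; array indices ('0', '1', ...)
            // become part of the key, e.g. Cases_0_Subject
            Object.assign(flat, flattenRecord(value, flatKey));
        } else {
            flat[flatKey] = value;
        }
    });
    return flat;
}
```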

Apex Controller

public with sharing class ManageContactsController {

    @AuraEnabled
    public static list<Contact> getContacts()
    {
        return [select firstname, name, lastname, email, phone, Owner.name, Owner.Profile.Name, (select id, subject from cases order by createdDate desc limit 1) from contact];
    }
}

Lightning Controller

({
   init: function (component, event, helper) {
        component.set('v.mycolumns', [
                {label: 'Contact Name', fieldName: 'Name', type: 'text'},
                {label: 'Phone', fieldName: 'Phone', type: 'phone'},
                {label: 'Email', fieldName: 'Email', type: 'email'},
            	{label: 'Owner', fieldName: 'Owner_Name', type: 'text'},
            	{label: 'Most Recent Case', fieldName: 'Cases_0_Subject', type: 'text'}
            ]);
        helper.getInfo(component, event, 'getContacts', 'mydata');
    }
})

Helper

({
    flattenObject : function(prefix, obj)
    {
        var flatObject = {};
        
        for(var prop in obj)
        {
            if(!obj.hasOwnProperty(prop)) continue;
            //prepend the parent property name so nested values like Owner.Name become Owner_Name
            var key = prefix ? prefix + '_' + prop : prop;
            //if this property is an object (including an array), flatten it recursively
            if(obj[prop] !== null && typeof obj[prop] === 'object')
            {
                Object.assign(flatObject, this.flattenObject(key, obj[prop]));
            }
            else
            {
                flatObject[key] = obj[prop];
            }
        }
        return flatObject;
    },
    
    flattenQueryResult : function(listOfObjects) {
        //a single record may be passed in; normalize it to an array
        //(typeof never returns 'Array', so use Array.isArray instead)
        if(!Array.isArray(listOfObjects))
        {
            listOfObjects = [listOfObjects];
        }
        
        for(var i = 0; i < listOfObjects.length; i++)
        {
            var obj = listOfObjects[i];
            for(var prop in obj)
            {
                if(!obj.hasOwnProperty(prop)) continue;
                //nested parents and child arrays both flatten the same way; array
                //indices become part of the key (e.g. Cases_0_Subject)
                if(obj[prop] !== null && typeof obj[prop] === 'object')
                {
                    Object.assign(obj, this.flattenObject(prop, obj[prop]));
                }
            }
        }
        return listOfObjects;
    },
    getInfo : function(component, event, methodName, targetAttribute) {
        var action = component.get('c.'+methodName);
        action.setCallback(this, $A.getCallback(function (response) {
            var state = response.getState();
            if (state === "SUCCESS") {
                console.log('Got Raw Response for ' + methodName + ' ' + targetAttribute);
                console.log(response.getReturnValue());
                
                var flattenedObject = this.flattenQueryResult(response.getReturnValue());
                
                component.set('v.'+targetAttribute, flattenedObject);
                
                console.log(flattenedObject);
            } else if (state === "ERROR") {
                var errors = response.getError();
                console.error(errors);
            }
        }));
        $A.enqueueAction(action);
    }
})

Component (Sorry my code highlighter didn’t like trying to parse this)

<aura:component controller="ManageContactsController" implements="forceCommunity:availableForAllPageTypes" access="global">
    <aura:attribute name="mydata" type="Object"/>
    <aura:attribute name="mycolumns" type="List"/>
    <aura:handler name="init" value="{! this }" action="{! c.init }"/>
    <h3>Contacts (With Sharing Applied)</h3>
    <lightning:datatable data="{! v.mydata }"
                         columns="{! v.mycolumns }"
                         keyField="Id"
                         hideCheckboxColumn="true"/>
</aura:component>

Result

Hope this helps!


Lightning Update List of Records Workaround (Quick Fix)

I’ve been doing some work with Salesforce Lightning, and so far it is certainly proving… challenging. I ran into an issue the other day for which I could find no obvious solution. I was attempting to pass a set of records from my JavaScript controller to the Apex controller for upsert. However, it was throwing an error along the lines of ‘upsert not allowed on generic sObject list’, even though the list of sObjects was in fact declared as a specific type. After messing around with various attempts at casting the list and modifying the objects in the JavaScript controller before passing them to Apex, I couldn’t find an elegant solution. Instead I found a workaround: simply create a new list of the proper object type and add the passed-in records to it. I feel like there is probably a ‘proper’ way to make this work, but this works for me, so I figured I’d share.

//***************** Helper *************************//
	saveMappingFields : function(component,fieldObjects,callback)
	{

        var action = component.get("c.saveMappingFields");
        action.setParams({
            fieldObjects: fieldObjects
        });        
        action.setCallback(this, function(actionResult){
         
            if (typeof callback === "function") {
            	callback(actionResult);
            }
        });  
        
        $A.enqueueAction(action);             
	}
	
//**************** Apex Controller **********************//
//FAILS
@AuraEnabled
global static string saveMappingFields(list<Mapping_Field__c> fieldObjects)
{
	list<Database.UpsertResult> saveFieldResults = Database.upsert(fieldObjects,false);
	return JSON.serialize(saveFieldResults);
}

//WORKS
@AuraEnabled
global static string saveMappingFields(list<Mapping_Field__c> fieldObjects)
{
	//wrapping the passed-in records in a new, concretely typed list avoids the generic sObject error
	list<Mapping_Field__c> fixedMappingFields = new list<Mapping_Field__c>(fieldObjects);
	
	list<Database.UpsertResult> saveFieldResults = Database.upsert(fixedMappingFields,false);
	return JSON.serialize(saveFieldResults);
}

Dynamic Apex Invocation/Callbacks

So I’ve been working on that DeepClone class, and it occurred to me that whatever invokes it might like to know when the process is done (so maybe it can do something with those created records). Seeing as DeepClone is by its very nature asynchronous, that presents a problem, since the caller cannot sit and wait for the process to complete. You know what other language has to deal with async issues a lot? JavaScript. In JavaScript we often solve this problem with a ‘callback’ function (I know callbacks are old and busted and promises are the new hotness, but bear with me here), wherein you call your asynchronous function and tell it what to call when it’s done. Most often that is done by passing in the actual function instead of just its name, but both are viable. Here is an example of what both might look like.

var someData = 'data to give to async function';

//first type of invocation passes in an actual function as the callback. 
asyncThing(someData,function(result){
	console.log('I passed in a function directly!' + result);
});

//second type of invocation passes in the name of a function to call instead
asyncThing(someData,'onCompleteHandler');

function onCompleteHandler(result)
{
	console.log('I passed in the name of a function to call and that happened' + result);
}

function asyncThing(data,callback)
{
	//async code here, maybe a callout or something.
	var result = 'probably a status code or the fetched data would go here';
	
	//if our callback is a function, then just straight up invoke it
	if(typeof callback == 'function')
	{
		callback(result);
	}
	//if our callback is a string, then dynamically invoke the global function with that name
	else if(typeof callback == 'string')
	{
		window[callback](result);
	}
}

So yeah, javascript is cool, it has callbacks. What does this have to do with Apex? Apex is strongly typed, you can’t just go around passing around functions as arguments, and you sure as hell can’t do dynamic invocation… or can you? Behold, by abusing the tooling api, I give you a basic implementation of a dynamic Apex callback!

public HttpResponse invokeCallback(string callback, string dataString)
{
	HttpResponse res = new HttpResponse();
	try
	{
		//build the anonymous Apex statement, e.g. myClass.myMethod('data'). Assumes dataString contains no unescaped single quotes
		string functionCall = callback+'(\''+dataString+'\');';
		HttpRequest req = new HttpRequest();
		req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionID());
		req.setHeader('Content-Type', 'application/json');
		string instanceURL = System.URL.getSalesforceBaseUrl().getHost().remove('-api' ).toLowerCase();
		String toolingendpoint = 'https://'+instanceURL+'/services/data/v28.0/tooling/executeAnonymous/?anonymousBody='+encodingUtil.urlEncode(functionCall,'utf-8');
		req.setEndpoint(toolingendpoint);
		req.setMethod('GET');
		
		Http h = new Http();
		res = h.send(req);
	}
	catch(exception e)
	{
		system.debug('\n\n\n\n--------------------- Error attempting callback!');
		system.debug(e);
		system.debug(res);
	}
	return res;
} 

What’s going on here? The Tooling API allows us to execute anonymous code. Normally the Tooling API is for external tools/languages to access Salesforce metadata and perform operations. However, by accessing it via REST and passing in both the name of a class and method, along with properly encoded data you’d like to pass (strings only, no complex object types), you can provide a dynamic callback specified at runtime. We simply create a GET request against the Tooling API REST endpoint and invoke the execute anonymous method, passing in the desired callback function name. So now when DeepClone, for example, is instantiated, the caller can set a class-level property naming the class and method it would like called when DeepClone is done doing its thing. It can pass back all the Ids of the records created, so any additional work can be performed. Of course, the class provided has to be public, and the method called must be static. Additionally, you have to add your own org’s URL to the allowed remote sites under Security -> Remote Site Settings. Anyway, I thought this was a pretty nice way of letting your @future and queueable methods pass information back to a class so you aren’t totally left in the dark about what the results were. Enjoy!
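To make the URL assembly concrete, here is the same request-building step sketched in JavaScript (a hypothetical helper; the instance URL and callback name are made-up examples, and the encoding call mirrors what EncodingUtil.urlEncode does in the Apex above):

```javascript
// Build the Tooling API executeAnonymous GET URL that invokes the callback.
// The anonymous body is a single Apex statement: callback('dataString');
function buildCallbackUrl(instanceUrl, callback, dataString) {
    var body = callback + "('" + dataString + "');";
    return instanceUrl +
        '/services/data/v28.0/tooling/executeAnonymous/?anonymousBody=' +
        encodeURIComponent(body); // URL-encode the statement for the query string
}
```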


Deep Clone (Round 2)

So a day or two ago I posted my first draft of a deep clone, which would allow easy cloning of an entire data hierarchy. It was a semi proof-of-concept thing with some limitations (it could only handle somewhat smaller data sets, and didn’t let you configure all-or-nothing inserts or specify whether you wanted to copy standard objects as well as custom ones). I was doing some thinking and I remembered hearing about the Queueable interface, which allows for asynchronous processing and bigger governor limits. I started thinking about chaining queueable jobs together to allow for copying much larger data sets. Each invocation gets its own governor limits and could theoretically go on as long as it takes, since you can chain jobs infinitely. I had attempted to use Queueable to solve this before, but I made the mistake of trying to kick off multiple jobs per invocation (one for each related object type). This obviously didn’t work due to the limits imposed on queueable jobs. Once I thought of a way to only need one invocation per call (basically just rolling all the records that need to get cloned into one object and iterating over it), I figured I might have a shot at making this work. I took what I had written before, added a few options, and I think I’ve done it: an asynchronous deep clone that operates in distinct batches, with all-or-nothing handling and cleanup in case of error. This is hot-off-the-presses code, so there are likely some lingering bugs, but I was too excited not to share it. Feast your eyes!
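The chaining pattern described above — each invocation processes one batch of work under fresh limits, then enqueues a follow-up job with whatever remains — can be modeled in a few lines of JavaScript. This is a toy illustration of the control flow only; all names are made up and nothing here is the Apex class’s actual API:

```javascript
// Toy model of chained queueable jobs: process one object type's records per
// "invocation", remove it from the work map, then "enqueue" the next run.
function runCloneJob(workMap, cloned) {
    cloned = cloned || [];
    var types = Object.keys(workMap);
    if (types.length === 0) return cloned;   // no work left: the chain ends
    var currentType = types[0];              // this invocation's batch
    workMap[currentType].forEach(function (recordId) {
        cloned.push(currentType + ':clone-of-' + recordId);
    });
    delete workMap[currentType];
    return runCloneJob(workMap, cloned);     // stand-in for System.enqueueJob
}
```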

public class deepClone implements Queueable {

    //global describe to hold object describe data for query building and relationship iteration
    public map<String, Schema.SObjectType> globalDescribeMap = Schema.getGlobalDescribe();
    
    //holds the data to be cloned. Keyed by object type. Contains cloneData which contains the object to clone, and some data needed for queries
    public map<string,cloneData> thisInvocationCloneMap = new map<string,cloneData>();
    
    //should the clone process be all or nothing?
    public boolean allOrNothing = false;
    
    //each iteration adds the records it creates to this property so in the event of an error we can roll it all back
    public list<id> allCreatedObjects = new list<id>();
    
    //only clone custom objects. Helps to avoid trying to clone system objects like chatter posts and such.
    public boolean onlyCloneCustomObjects = true;
    
    public static id clone(id sObjectId, boolean onlyCustomObjects, boolean allOrNothing)
    {
        
        deepClone startClone= new deepClone();
        startClone.onlyCloneCustomObjects  = onlyCustomObjects;
        startClone.allOrNothing = allOrNothing;
        
        sObject thisObject = sObjectId.getSobjectType().newSobject(sObjectId);
        cloneData thisClone = new cloneData(new list<sObject>{thisObject}, new map<id,id>());
        map<string,cloneData> cloneStartMap = new map<string,cloneData>();
        
        cloneStartMap.put(sObjectId.getSobjectType().getDescribe().getName(),thisClone);
        
        startClone.thisInvocationCloneMap = cloneStartMap;
        return System.enqueueJob(startClone);      
    }
    
    public void execute(QueueableContext context) {
        deepCloneBatched();
    }
        
    /**
    * @description Clones the records in thisInvocationCloneMap and their entire related data hierarchy. Currently only clones custom objects, but enabling standard objects is easy. It is disabled because it increases the risk of hitting governor limits
    * @return list<id> the ids of all of the objects that were created during this invocation of the clone.
    **/
    public list<id> deepCloneBatched()
    {
        map<string,cloneData> nextInvocationCloneMap = new map<string,cloneData>();
        
        //iterate over every object type in the public map
        for(string relatedObjectType : thisInvocationCloneMap.keySet())
        { 
            list<sobject> objectsToClone = thisInvocationCloneMap.get(relatedObjectType).objectsToClone;
            map<id,id> previousSourceToCloneMap = thisInvocationCloneMap.get(relatedObjectType).previousSourceToCloneMap;
            
            system.debug('\n\n\n--------------------  Cloning ' + objectsToClone.size() + ' records');
            list<id> objectIds = new list<id>();
            list<sobject> clones = new list<sobject>();
            list<sObject> newClones = new list<sObject>();
            map<id,id> sourceToCloneMap = new map<id,id>();
            list<database.saveresult> cloneInsertResult;
                       
            //if this function has been called recursively, then the previous batch of cloned records
            //have not been inserted yet, so now they must be before we can continue. Also, in that case
            //because these are already clones, we do not need to clone them again, so we can skip that part
            if(objectsToClone[0].Id == null)
            {
                //if they don't have an id that means these records are already clones. So just insert them with no need to clone beforehand.
                cloneInsertResult = database.insert(objectsToClone,allOrNothing);

                clones.addAll(objectsToClone);
                
                for(sObject thisClone : clones)
                {
                    sourceToCloneMap.put(thisClone.getCloneSourceId(),thisClone.Id);
                }
                            
                objectIds.addAll(new list<id>(previousSourceToCloneMap.keySet()));
                //get the ids of all these objects.                    
            }
            else
            {
                //get the ids of all these objects.
                for(sObject thisObj :objectsToClone)
                {
                    objectIds.add(thisObj.Id);
                }
    
                //create a select all query to get all the data for these objects since if we only got passed a basic sObject without data 
                //then the clone will be empty
                string objectDataQuery = buildSelectAllStatment(relatedObjectType);
                
                //add a where condition
                objectDataQuery += ' where id in :objectIds';
                
                //get the details of this object
                list<sObject> objectToCloneWithData = database.query(objectDataQuery);
    
                for(sObject thisObj : objectToCloneWithData)
                {              
                    sObject clonedObject = thisObj.clone(false,true,false,false);
                    clones.add(clonedObject);               
                }    
                
                //insert the clones
                cloneInsertResult = database.insert(clones,allOrNothing);
                
                for(sObject thisClone : clones)
                {
                    sourceToCloneMap.put(thisClone.getCloneSourceId(),thisClone.Id);
                }
            }        
            
            for(database.saveResult saveResult :  cloneInsertResult)
            {
                if(saveResult.success)
                {
                    allCreatedObjects.add(saveResult.getId());
                }
                else if(allOrNothing)
                {
                    cleanUpError();
                    return allCreatedObjects;
                }
            }
              
            //Describes this object type so we can deduce its child relationships
            Schema.DescribeSObjectResult objectDescribe = globalDescribeMap.get(relatedObjectType).getDescribe();
                        
            //get this objects child relationship types
            List<Schema.ChildRelationship> childRelationships = objectDescribe.getChildRelationships();
    
            system.debug('\n\n\n-------------------- ' + objectDescribe.getName() + ' has ' + childRelationships.size() + ' child relationships');
            
            //then have to iterate over every child relationship type, and every record of that type and clone them as well. 
            for(Schema.ChildRelationship thisRelationship : childRelationships)
            { 
                          
                Schema.DescribeSObjectResult childObjectDescribe = thisRelationship.getChildSObject().getDescribe();
                string relationshipField = thisRelationship.getField().getDescribe().getName();
                
                try
                {
                    system.debug('\n\n\n-------------------- Looking at ' + childObjectDescribe.getName() + ' which is a child object of ' + objectDescribe.getName());
                    
                    if(!childObjectDescribe.isCreateable() || !childObjectDescribe.isQueryable())
                    {
                        system.debug('-------------------- Object is not one of the following: queryable, creatable. Skipping attempting to clone this object');
                        continue;
                    }
                    if(onlyCloneCustomObjects && !childObjectDescribe.isCustom())
                    {
                        system.debug('-------------------- Object is not custom and custom object only clone is on. Skipping this object.');
                        continue;                   
                    }
                    if(Limits.getQueries() >= Limits.getLimitQueries())
                    {
                        system.debug('\n\n\n-------------------- Governor limits hit. Must abort.');
                        
                        //if we hit a limit, and this is an all or nothing job, we have to delete what we created and abort
                        if(allOrNothing)
                        {
                            cleanUpError();
                        }
                        return allCreatedObjects;
                    }
                    //create a select all query from the child object type
                    string childDataQuery = buildSelectAllStatment(childObjectDescribe.getName());
                    
                    //add a where condition that will only find records that are related to this record. The field which the relationship is defined is stored in the maps value
                    childDataQuery+= ' where '+relationshipField+ ' in :objectIds';
                    
                    //get the details of this object
                    list<sObject> childObjectsWithData = database.query(childDataQuery);
                    
                    system.debug('\n\n\n-------------------- Object queried. Found ' + childObjectsWithData.size() + ' records to clone');
                    
                    if(!childObjectsWithData.isEmpty())
                    {               
                        //use a fresh list for each relationship so each cloneData only holds records of its own type
                        newClones = new list<sObject>();
                        map<id,id> childRecordSourceToClone = new map<id,id>();
                        
                        for(sObject thisChildObject : childObjectsWithData)
                        {
                            childRecordSourceToClone.put(thisChildObject.Id,null);
                            
                            //clone the object
                            sObject newClone = thisChildObject.clone();
                            
                            //since the record we cloned still has the original parent id, we now need to update the clone with the id of its cloned parent.
                            //to do that we reference the map we created above and use it to get the new cloned parent.                        
                            system.debug('\n\n\n----------- Attempting to change parent of clone....');
                            id newParentId = sourceToCloneMap.get((id) thisChildObject.get(relationshipField));
                            
                            system.debug('Old Parent: ' + thisChildObject.get(relationshipField) + ' new parent ' + newParentId);
                            
                            //write the new parent value into the record
                            newClone.put(thisRelationship.getField().getDescribe().getName(),newParentId );
                            
                            //add this new clone to the list. It will be inserted once the deepClone function is called again. I know it's a little odd to not just insert them now
                            //but it saves on redundant logic in the long run.
                            newClones.add(newClone);             
                        }  
                        cloneData thisCloneData = new cloneData(newClones,childRecordSourceToClone);
                        nextInvocationCloneMap.put(childObjectDescribe.getName(),thisCloneData);                             
                    }                                       
                       
                }
                catch(exception e)
                {
                    system.debug('\n\n\n---------------------- Error attempting to clone child records of type: ' + childObjectDescribe.getName());
                    system.debug(e); 
                }            
            }          
        }
        
        system.debug('\n\n\n-------------------- Done iterating cloneable objects.');
        
        system.debug('\n\n\n-------------------- Clone Map below');
        system.debug(nextInvocationCloneMap);
        
        system.debug('\n\n\n-------------------- All created object ids thus far across this invocation');
        system.debug(allCreatedObjects);
        
        //if our map is not empty that means we have more records to clone. So queue up the next job.
        if(!nextInvocationCloneMap.isEmpty())
        {
            system.debug('\n\n\n-------------------- Clone map is not empty. Sending objects to be cloned to another job');
            
            deepClone nextIteration = new deepClone();
            nextIteration.thisInvocationCloneMap = nextInvocationCloneMap;
            nextIteration.allCreatedObjects = allCreatedObjects;
            nextIteration.onlyCloneCustomObjects  = onlyCloneCustomObjects;
            nextIteration.allOrNothing = allOrNothing;
            id  jobId = System.enqueueJob(nextIteration);       
            
            system.debug('\n\n\n-------------------- Next queueable job scheduled. Id is: ' + jobId);  
        }
        
        system.debug('\n\n\n-------------------- Cloning Done!');
        
        return allCreatedObjects;
    }
     
    /**
    * @description create a string which is a select statement for the given object type that will select all fields. Equivalent to SELECT * FROM objectName in SQL
    * @param objectName the API name of the object which to build a query string for
    * @return string a string containing the SELECT keyword, all the fields on the specified object and the FROM clause to specify that object type. You may add your own where statements after.
    **/
    public string buildSelectAllStatment(string objectName){ return buildSelectAllStatment(objectName, new list<string>());}
    public string buildSelectAllStatment(string objectName, list<string> extraFields)
    {       
        // Initialize setup variables
        String query = 'SELECT ';
        String objectFields = String.Join(new list<string>(globalDescribeMap.get(objectName).getDescribe().fields.getMap().keySet()),',');
        if(extraFields != null)
        {
            objectFields += ','+String.Join(extraFields,',');
        }
        
        objectFields = objectFields.removeEnd(',');
        
        query += objectFields;
    
        // Add FROM statement
        query += ' FROM ' + objectName;
                 
        return query;   
    }    
    
    public void cleanUpError()
    {
        database.delete(allCreatedObjects);
    }
    
    public class cloneData
    {
        public list<sObject> objectsToClone = new list<sObject>();        
        public map<id,id> previousSourceToCloneMap = new map<id,id>();  
        
        public cloneData(list<sObject> objects, map<id,id> previousDataMap)
        {
            this.objectsToClone = objects;
            this.previousSourceToCloneMap = previousDataMap;
        }   
    }    
}    

It’ll clone your record, your record’s children, your record’s children’s children, and yes, even your record’s children’s children’s children (you get the point)! Simply invoke the deepClone.clone() method with the id of the object to start the clone process at, whether you want to only copy custom objects, and whether you want to use all or nothing processing. Deep Clone takes care of the rest, automatically figuring out relationships, cloning, re-parenting, and generally being awesome. As always I’m happy to get feedback or suggestions! Enjoy!
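For example, from execute anonymous you might kick it off like this (the record Id here is a made-up placeholder, not a real one):

```apex
//hypothetical root record id; substitute the id of the record you want cloned.
//true, true = only clone custom objects, with all or nothing handling.
Id rootRecordId = 'a0B4100000AbcDeEAF';
Id jobId = deepClone.clone(rootRecordId, true, true);
system.debug('Deep clone job enqueued with id: ' + jobId);
```

The returned id is the queueable job id, so you can check on the first invocation's progress via the AsyncApexJob object if you like.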

-Kenji


Salesforce True Deep Clone, the (Im)Possible Dream

So getting back to work work (sorry Alexa/Amazon/Echo, I’ve gotta pay for more smart devices somehow), I’ve been working on a project where there is a fairly in-depth hierarchy of records. We will call them surveys. These surveys have records related to them; those records have other records related to them, and so on. It’s a semi complicated “tree” that goes about 5 levels deep with different kinds of objects in each “branch”. Of course with such a complicated structure, but a common need to copy and modify it for a new project, the request for a better clone came floating across my desk. Now Salesforce does have a nice clone tool built in, but it doesn’t have the ability to copy an entire hierarchy, and some preliminary searches didn’t turn up anything great either. The reason why? It’s pretty damn tricky, and governor limits can initially make it seem impossible. What I have here is an initial attempt at a ‘true deep clone’ function. You give it a record (or possibly a list of records, but I wouldn’t push your luck) to clone. It will do that, then clone the children and re-parent them to your new clone. It will then find all those records’ children and clone and re-parent them as well, all the way down. Without further ado, here is the code.

    //clones a batch of records. Must all be of the same type.
    //very experimental. Small jobs only!
    public static Map<String, Schema.SObjectType> globalDescribeMap = Schema.getGlobalDescribe();    
    public static list<sObject> deepCloneBatched(list<sObject> objectsToClone) { return deepCloneBatched(objectsToClone,new map<id,id>());}
    public static list<sObject> deepCloneBatched(list<sObject> objectsToClone, map<id,id> previousSourceToCloneMap)
    {
        system.debug('\n\n\n--------------------  Cloning ' + objectsToClone.size() + ' records');
        list<id> objectIds = new list<id>();
        list<sobject> clones = new list<sobject>();
        list<sObject> newClones = new list<sObject>();
        map<id,id> sourceToCloneMap = new map<id,id>();
        
        
        if(objectsToClone.isEmpty())
        {
            system.debug('\n\n\n-------------------- No records in set to clone. Aborting');
            return clones;
        }
                
        //if this function has been called recursively, then the previous batch of cloned records
        //have not been inserted yet, so now they must be before we can continue. Also, in that case
        //because these are already clones, we do not need to clone them again, so we can skip that part
        if(objectsToClone[0].Id == null)
        {
            //if they don't have an id that means these records are already clones. So just insert them with no need to clone beforehand.
            insert objectsToClone;
            clones.addAll(objectsToClone);
            
            for(sObject thisClone : clones)
            {
                sourceToCloneMap.put(thisClone.getCloneSourceId(),thisClone.Id);
            }
                        
            objectIds.addAll(new list<id>(previousSourceToCloneMap.keySet()));
            //get the ids of all these objects.                    
        }
        else
        {
            //get the ids of all these objects.
            for(sObject thisObj :objectsToClone)
            {
                objectIds.add(thisObj.Id);
            }
            
            for(sObject thisObj : objectsToClone)
            {
                sObject clonedObject = thisObj.clone(false,true,false,false);
                clones.add(clonedObject);               
            }    
            
            //insert the clones
            insert clones;
            
            for(sObject thisClone : clones)
            {
                sourceToCloneMap.put(thisClone.getCloneSourceId(),thisClone.Id);
            }
        }        

        //figure out what kind of object we are dealing with
        string relatedObjectType = objectsToClone[0].Id.getSobjectType().getDescribe().getName();
        
        //Describes this object type so we can deduce it's child relationships
        Schema.DescribeSObjectResult objectDescribe = globalDescribeMap.get(relatedObjectType).getDescribe();
                    
        //get this objects child relationship types
        List<Schema.ChildRelationship> childRelationships = objectDescribe.getChildRelationships();

        system.debug('\n\n\n-------------------- ' + objectDescribe.getName() + ' has ' + childRelationships.size() + ' child relationships');
        
        //then have to iterate over every child relationship type, and every record of that type and clone them as well. 
        for(Schema.ChildRelationship thisRelationship : childRelationships)
        { 
                      
            Schema.DescribeSObjectResult childObjectDescribe = thisRelationship.getChildSObject().getDescribe();
            string relationshipField = thisRelationship.getField().getDescribe().getName();
            
            try
            {
                system.debug('\n\n\n-------------------- Looking at ' + childObjectDescribe.getName() + ' which is a child object of ' + objectDescribe.getName());
                
                if(!childObjectDescribe.isCreateable() || !childObjectDescribe.isQueryable() || !childObjectDescribe.isCustom())
                {
                    system.debug('-------------------- Object is not one of the following: queryable, creatable, or custom. Skipping attempting to clone this object');
                    continue;
                }
                if(Limits.getQueries() >= Limits.getLimitQueries())
                {
                    system.debug('\n\n\n-------------------- Governor limits hit. Must abort.');
                    return clones;
                }
                //create a select all query from the child object type
                string childDataQuery = buildSelectAllStatment(childObjectDescribe.getName());
                
                //add a where condition that will only find records that are related to this record. The field which the relationship is defined is stored in the maps value
                childDataQuery+= ' where '+relationshipField+ ' in :objectIds';
                
                //get the details of this object
                list<sObject> childObjectsWithData = database.query(childDataQuery);
                
                if(!childObjectsWithData.isEmpty())
                {               
                    //use a fresh list for each relationship so we don't re-process clones from a previous relationship
                    newClones = new list<sObject>();
                    map<id,id> childRecordSourceToClone = new map<id,id>();
                    
                    for(sObject thisChildObject : childObjectsWithData)
                    {
                        childRecordSourceToClone.put(thisChildObject.Id,null);
                        
                        //clone the object
                        sObject newClone = thisChildObject.clone();
                        
                        //since the record we cloned still has the original parent id, we now need to update the clone with the id of its cloned parent.
                        //to do that we reference the map we created above and use it to get the new cloned parent.                        
                        system.debug('\n\n\n----------- Attempting to change parent of clone....');
                        id newParentId = sourceToCloneMap.get((id) thisChildObject.get(relationshipField));
                        
                        system.debug('Old Parent: ' + thisChildObject.get(relationshipField) + ' new parent ' + newParentId);
                        
                        //write the new parent value into the record
                        newClone.put(thisRelationship.getField().getDescribe().getName(),newParentId );
                        
                        //add this new clone to the list. It will be inserted once the deepClone function is called again. I know it's a little odd to not just insert them now
                        //but it saves on redundant logic in the long run.
                        newClones.add(newClone);             
                    }  
                    //now we need to call this function again, passing in the newly cloned records, so they can be inserted, as well as passing in the ids of the original records
                    //that spawned them so the next time the query can find the records that currently exist that are related to the kind of records we just cloned.                
                    clones.addAll(deepCloneBatched(newClones,childRecordSourceToClone));                                  
                }                    
            }
            catch(exception e)
            {
                system.debug('\n\n\n---------------------- Error attempting to clone child records of type: ' + childObjectDescribe.getName());
                system.debug(e); 
            }            
        }
        
        return clones;
    }
     
    /**
    * @description create a string which is a select statement for the given object type that will select all fields. Equivalent to SELECT * FROM objectName in SQL
    * @param objectName the API name of the object which to build a query string for
    * @return string a string containing the SELECT keyword, all the fields on the specified object and the FROM clause to specify that object type. You may add your own where statements after.
    **/
    public static string buildSelectAllStatment(string objectName){ return buildSelectAllStatment(objectName, new list<string>());}
    public static string buildSelectAllStatment(string objectName, list<string> extraFields)
    {       
        // Initialize setup variables
        String query = 'SELECT ';
        String objectFields = String.Join(new list<string>(Schema.getGlobalDescribe().get(objectName).getDescribe().fields.getMap().keySet()),',');
        if(extraFields != null)
        {
            objectFields += ','+String.Join(extraFields,',');
        }
        
        objectFields = objectFields.removeEnd(',');
        
        query += objectFields;
    
        // Add FROM statement
        query += ' FROM ' + objectName;
                 
        return query;   
    }

You should be able to just copy and paste that into a class, invoke the deepCloneBatched method with the record you want to clone, and it should take care of the rest, cloning every related record that it can. It skips non-custom objects for now (because I didn’t need them) but you can adjust that by removing the part of the if condition that says

|| !childObjectDescribe.isCustom()

And then it will also clone all the standard objects it can. Again, this is kind of a ‘rough draft’ but it does seem to be working. Even cloning 111 records of several different types, I was still well under all governor limits. I’d explain more about how it works, but the comments are there, it’s 3:00 in the morning, and I’m content to summarize the workings by shouting “It’s magic. Don’t question it”, and walking off stage. Let me know if you have any clever ways to make it more efficient, of which I have no doubt there are. Anyway, enjoy. I hope it helps someone out there.
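If it helps, a hypothetical invocation might look like the following. Survey__c stands in for whatever your own root custom object is; it isn't part of the code above.

```apex
//hypothetical example: Survey__c is a placeholder for your root custom object.
//this version clones the fields present in memory, so query the record first.
Survey__c rootRecord = [SELECT Id FROM Survey__c LIMIT 1];

//clone the record and its entire related custom-object hierarchy
list<sObject> created = deepCloneBatched(new list<sObject>{rootRecord});
system.debug('Created ' + created.size() + ' cloned records');
```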


Mimicking callback functions for Visualforce ActionFunctions

Hey everyone. So I’ve got a nifty ‘approach’ for you this time around. So let me give you a quick run down on what I was doing, the problem I encountered and how I decided to solve it using what I believe to be a somewhat novel approach. The deal is that I have been working on a fairly complicated ‘one page’ app for mobile devices. What I decided to do was have one parent visualforce page, and a number of components that are hidden and shown depending on what ‘page’ the user is on. This allows for a global javascript scope to be shared between the components and also for them to have their own unique namespaces as well. I may cover the pros and cons of this architecture later.

The issue I started to have is that I wanted some action functions on the main parent container page to be used by the components in the page. That’s fine, no problem there. The issue becomes the fact that since actionFunctions are asynchronous and do not allow for dynamic callback functions, anything that wants to invoke your actionFunction is stuck having the same oncomplete function as everything else that may want to invoke it. So if component A and component B both want to invoke ActionFunctionZ, they are both stuck with the same oncomplete function, and since it’s async there is no good way to know when it’s done. Or is there?

My solution to this problem doesn’t use any particularly amazing hidden features, just a bit of applied javascript knowledge. What we are going to do is create a javascript object in the global/top level scope. That object is going to have properties that match the names of action functions. The properties will contain the function to run once the action function is complete. Then that property will be deleted to clean up the scope for the next caller. That might sound a little whack. Here, let’s check an example.

    <style>
        #contentLoading
        {
            height: 100%;
            width: 100%;
            left: 0;
            top: 0;
            overflow: hidden;
            position: fixed; 
            display: table;
            background-color: rgba(9, 9, 12, 0.5);  
              
        }
        #spinnerContainer
        {
            display: table-cell;
            vertical-align: middle;        
            width:200px;
            text-align:center;
            margin-left:auto;
            margin-right:auto;
        }

        div.spinner {
          position: relative;
          width: 54px;
          height: 54px;
          display: inline-block;
        }
        
        div.spinner div {
          width: 12%;
          height: 26%;
          background: #fff;
          position: absolute;
          left: 44.5%;
          top: 37%;
          opacity: 0;
          -webkit-animation: fade 1s linear infinite;
          -webkit-border-radius: 50px;
          -webkit-box-shadow: 0 0 3px rgba(0,0,0,0.2);
        }
        
        div.spinner div.bar1 {-webkit-transform:rotate(0deg) translate(0, -142%); -webkit-animation-delay: 0s;}    
        div.spinner div.bar2 {-webkit-transform:rotate(30deg) translate(0, -142%); -webkit-animation-delay: -0.9167s;}
        div.spinner div.bar3 {-webkit-transform:rotate(60deg) translate(0, -142%); -webkit-animation-delay: -0.833s;}
        div.spinner div.bar4 {-webkit-transform:rotate(90deg) translate(0, -142%); -webkit-animation-delay: -0.75s;}
        div.spinner div.bar5 {-webkit-transform:rotate(120deg) translate(0, -142%); -webkit-animation-delay: -0.667s;}
        div.spinner div.bar6 {-webkit-transform:rotate(150deg) translate(0, -142%); -webkit-animation-delay: -0.5833s;}
        div.spinner div.bar7 {-webkit-transform:rotate(180deg) translate(0, -142%); -webkit-animation-delay: -0.5s;}
        div.spinner div.bar8 {-webkit-transform:rotate(210deg) translate(0, -142%); -webkit-animation-delay: -0.41667s;}
        div.spinner div.bar9 {-webkit-transform:rotate(240deg) translate(0, -142%); -webkit-animation-delay: -0.333s;}
        div.spinner div.bar10 {-webkit-transform:rotate(270deg) translate(0, -142%); -webkit-animation-delay: -0.25s;}
        div.spinner div.bar11 {-webkit-transform:rotate(300deg) translate(0, -142%); -webkit-animation-delay: -0.1667s;}
        div.spinner div.bar12 {-webkit-transform:rotate(330deg) translate(0, -142%); -webkit-animation-delay: -0.0833s;}
    
         @-webkit-keyframes fade {
          from {opacity: 1;}
          to {opacity: 0.25;}
        }    	
	</style>
	
		var globalScope = new Object();
		globalScope.actionFunctionCallbacks = {};
		
		function actionFunctionOnCompleteDispatcher(functionName)
		{
			console.log('Invoking callback handler for ' +functionName);
			console.log(globalScope.actionFunctionCallbacks);
			
			if(globalScope.actionFunctionCallbacks.hasOwnProperty(functionName))
			{
				console.log('Found registered function. Calling... ');
				console.log(globalScope.actionFunctionCallbacks[functionName]);
				globalScope.actionFunctionCallbacks[functionName]();
				delete globalScope.actionFunctionCallbacks[functionName];
			}
			else
			{
				console.log('No callback handler found for ' + functionName);
			}    
		}         
		
		function registerActionFunctionCallback(functionName, callback)
		{
			console.log('Registering callback function for ' + functionName + ' as ' + callback);
			globalScope.actionFunctionCallbacks[functionName] = callback;
			
			console.log(globalScope.actionFunctionCallbacks);
		} 
		
		function linkActionOne(dataValue)
		{
			registerActionFunctionCallback('doThing', function(){
				console.log('Link Action One was clicked. Then the doThing action function was called. Once that was done this happened');
				alert('I was spawned from link action 1!');
			});		
			
			doThing(dataValue);
		}
		
		function linkActionTwo(dataValue)
		{
			registerActionFunctionCallback('doThing', function(){
				console.log('Link Action Two was clicked. Then the doThing action function was called. Once that was done this happened');
				alert('I was spawned from link action 2!');
			});		

			doThing(dataValue);
		}

		function loading(isLoading) {
			if (isLoading) 
			{            
				$('#contentLoading').show();
			}
			else {
				$('#contentLoading').hide();
			}	
		}		
    
	
	<apex:form >
		<apex:actionFunction name="doThing" action="{!DoTheThing}" reRender="whatever" oncomplete="actionFunctionOnCompleteDispatcher('doThing');">
			<apex:param name="some_data"  value="" />
		</apex:actionFunction>
		
		<apex:actionStatus id="loading" onstart="loading(true)" onstop="loading(false)" />
	
		<a href="#" onclick="linkActionOne('Link1!')">Link One!</a>
		<a href="#" onclick="linkActionTwo('Link2!')">Link Two!</a>

		
		<div id="contentLoading" style="display:none">
			<div id="spinnerContainer">
				<div class="spinner">
					<div class="bar1"></div>
					<div class="bar2"></div>
					<div class="bar3"></div>
					<div class="bar4"></div>
					<div class="bar5"></div>
					<div class="bar6"></div>
					<div class="bar7"></div>
					<div class="bar8"></div>
					<div class="bar9"></div>
					<div class="bar10"></div>
					<div class="bar11"></div>
					<div class="bar12"></div>
				</div>
			</div>
		</div>
	</apex:form>

So what the hell is going on here? Long story short, we have two links which both call the same actionFunction but have different 'callbacks' that happen when that actionFunction is complete. I was trying to come up with a more interesting example, but I figured I should keep it simple for the sake of explanation. You click link one and the doThing action is called. When it completes, it calls the actionFunctionOnCompleteDispatcher function with its own name. That function looks to see if any callbacks have been registered for that function name. If so, the callback is called. If not, nothing happens. Pretty slick eh? You may be wondering why I included all that code for the action status, the loading animation, the overlay and all that. Not really relevant to what we are doing, right (though the animation is cool)? The answer (other than the fact that you get a cool free loading mechanism) is that this approach as it stands will start to run into odd issues if your user clicks link2 before link1 has finished doing its work. The callback registered by link2 would get called twice: when the call to doThing from link1 completes, it's going to call whatever function is currently registered, even if the click from link2 has already overwritten what link1 set. I'm thinking you could get around this by making the property of the global object an array instead of a single function reference. Each call would push its requested callback into the array, and callbacks would be removed from the array as they were called, but I haven't played with this approach yet (I'm pretty sure it would work, I'm just too lazy and tired to write it up for this post. If there is interest I'll do it later). In any case, putting up a blocking loading screen while the action function does its work ensures that the user cannot cause chaos by mashing links and overwriting callbacks.
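For the curious, the array idea described above can be sketched in plain JavaScript. This is a standalone sketch, not tested against a real Visualforce page; the function names mirror the code above, but the queue behavior is my untested suggestion.

```javascript
// Sketch of the queue-based variation: instead of one callback reference per
// action function name, keep an array and drain it first-in-first-out as
// completions arrive, so a second click can no longer overwrite the first.
var globalScope = { actionFunctionCallbacks: {} };

function registerActionFunctionCallback(functionName, callback) {
    // Lazily create the queue for this action function, then push the callback.
    if (!globalScope.actionFunctionCallbacks[functionName]) {
        globalScope.actionFunctionCallbacks[functionName] = [];
    }
    globalScope.actionFunctionCallbacks[functionName].push(callback);
}

function actionFunctionOnCompleteDispatcher(functionName) {
    var queue = globalScope.actionFunctionCallbacks[functionName];
    if (queue && queue.length > 0) {
        // shift() removes and runs the oldest registered callback.
        queue.shift()();
    }
}
```

With this, if link2 is clicked while link1's request is still in flight, each completion consumes its own callback in order instead of the second registration clobbering the first.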

The thing that is kind of cool about this, which becomes clear pretty quickly, is that you can start 'chaining' callbacks. You can have a whole series of action functions that all execute in sequence instead of just running async all over the place. Also make note of the commenting. The thing about callbacks is you can quickly end up in 'callback hell', where it gets very difficult to track what is happening in what order, so I at least attempt to label them in an effort to stem the madness. This is just a quick copy-paste from the thing I'm actually working on to give you a flavor of how the chaining can work.

//once a project has been created we need to clear out any existing temp record, then set the type of the new/current temp record to tpe,
//then set the project Id on that temp record to the one we created. Finally we can change page to the select accounts screen.
VSD_SelectProject.addNewTpeComplete = function()
{

	//order 2: happens after clearTempRecord is done 
	//once the existing temp record has been cleared and a new one is created, get a new temp record and set the type as tpe
	registerActionFunctionCallback('clearTempRecord', function(){
		setRequestType('tpe');
	});
				
	//order 3: happens after setRequestType is done 
	//once set request type is done (which means we should now have a temp record in the shared scope), call set project id
	registerActionFunctionCallback('setRequestType', function(){
		setProjectId('{!lastCreatedProjectId}');
	});

	//order 4: happens after setProjectId is done 
	//once set project id is called and completed, change the page to new_pcr_start (poorly named; it should actually be called select_accounts)
	registerActionFunctionCallback('setProjectId', function(){
		setTitle('New TPE Request');
		setSubHeader('Select Accounts');
					
		changePage('new_pcr_start');
	});
	 
	//order 1: happens first. Kicks off the callback chain defined above.                                                
	clearTempRecord();                
}

Anyway, I hope this might help some folks. I know it would be easy to get around this issue in many cases by just creating several copies of the 'same' actionFunction with different names and callbacks, but who wants dirty repetitive code like that?

Tune in next time as I reveal the solution to an odd ‘bug’ that prevents apex:inputFields from binding to their controller values. Till next time!

-Kenji


Export SOQL Query as CSV

Hey guys,
Long time no blog! Sorry about that, I've been kind of busy and honestly haven't had too many interesting tidbits to share. However, I think I have something kind of neat to show you. I had a project recently where the user wanted to be able to create a custom SOQL query and export the results as a CSV file. I don't know why they didn't want to use regular reports and their export feature (my guess is they figured the query might be too complex or something), but it sounded fun to write, so I didn't argue.

Breaking this requirement down into its individual parts revealed the challenges I'd have to solve:
1) Allow a user to create a custom SOQL query through the standard interface
2) Extract and iterate over the queried fields to create the column headings
3) Properly format the query results as a CSV file
4) Provide the proper MIME type for the visualforce page to prompt the browser to download the generated file

As it turns out, most of this was pretty easy. I decided to create a custom object called ‘SOQL_Query_Export__c’ where a user could create a record then specify the object to query against, the fields to get, the where condition, order by and limit statements. This would allow for many different queries to be easily created and saved, or shared between orgs. Obviously the user would have to know how to write SOQL in the first place, but in this requirement that seemed alright. The benefit as well is that an admin could pre-write a query, then users could just run it whenever.

With my data model/object created now I set about writing the apex controller. I’ll post it, and explain it after.

public class SOQL_Export {

    public SOQL_Query_Export__c exporter     {get;set;}
    public list<sobject>        queryResults {get;set;}
    public list<string>         queryFields  {get;set;}
    public string               queryString  {get;set;}
    public string               fileName     {get;set;}
    
    public SOQL_Export(ApexPages.StandardController controller) 
    {
        //Because the fields of the exporter object are not referenced on the visualforce page we need to explicitly tell the controller
        //to include them. Instead of hard coding the names of the fields I want to reference, I simply describe the exporter object
        //and use the keyset of the fieldMap to include all the existing fields of the exporter object.
        
        //describe object
        Map<String, Schema.SObjectField> fieldMap = Schema.SOQL_Query_Export__c.sObjectType.getDescribe().fields.getMap();
        
        //create list of fields from fields map
        list<string> fields = new list<string>(fieldMap.keySet());
        
        //add fields to controller
        if(!Test.isRunningTest())
        {
            controller.addFields(fields);
        }
        //get the controller value
        exporter = (SOQL_Query_Export__c) controller.getRecord();

        //create a filename for this exported file
        fileName = exporter.name + ' ' + string.valueOf(dateTime.now());
                
        //get the proper SOQL order direction from the order direction on the exporter object (Ascending = asc, Descending = desc)
        string orderDirection = exporter.Order_Direction__c == 'Ascending' ? 'asc' : 'desc';
        
        //create a list of fields from the comma separated list the user entered in the config object
        queryFields =  exporter.fields__c.split(',');
        
        //create the query string using string appending and some ternary logic
        queryString = 'select ' + exporter.fields__c + ' from ' + exporter.object_name__c;
        queryString += exporter.where_condition__c != null ? ' where ' + exporter.where_condition__c : '';
        queryString += exporter.Order_by__c != null ? ' order by ' + exporter.Order_by__c + ' ' + orderDirection :'';
        queryString += exporter.Limit__c != null ? ' limit ' +string.valueOf(exporter.Limit__c) : ' limit 10000';
        
        //run the query
        queryResults = database.query(queryString);
        
        
    }

    //creates and returns a newline character for the CSV export. Seems kind of hacky I know, but there does not seem to be a better
    //way to generate a newline character within visualforce itself.
    public static String getNewLine() {
      return '\n';
    }
}

Because I was going to use the SOQL_Query_Export__c object as the standard controller, my apex class would be an extension. This meant using the controller.addFields method (fields not explicitly added by the addFields method or referenced in the visualforce page are not available on the record passed into the controller; if I had attempted to reference SOQL_Query_Export__c.Name without putting it in my addFields call, or referencing it on the invoking page, it would not be available). Since my visualforce page was only going to be outputting CSV content, I had to manually add the fields I wanted to reference. Instead of hard coding that list, I made it dynamic by describing the SOQL_Query_Export__c object and passing the fields.getMap() keyset to the controller.addFields method. Also, something to know: test classes cannot use the addFields method, so wrap that part in an if statement.

Next it's just the simple work of constructing a filename for the generated file and splitting the fields (so I have an array I can loop over to generate the column headers for the CSV file). Then it's just generating the actual query string. I used some ternary statements since things like order by and limit are not really required, and I included a hard limit of 10,000 records if one isn't specified, since that is the largest a read-only collection of sobjects can be. Finally we just run the query. That last method in the class is used by the visualforce page to generate proper CSV line breaks (since you can't do it within the page itself. Weird, I know).

So now with the controller, we look at the page.

<apex:page standardController="SOQL_Query_Export__c" cache="true"  extensions="SOQL_Export" readOnly="true" showHeader="false" standardStylesheets="false" sidebar="false" contentType="application/octet-stream#{!fileName}.csv">

  <apex:repeat value="{!queryFields}" var="fieldName">{!fieldName},</apex:repeat>{!newLine}
  
  <apex:repeat value="{!queryResults}" var="record"><apex:repeat value="{!queryFields}" var="fieldName">{!record[fieldName]},</apex:repeat>{!newLine}</apex:repeat>
  

</apex:page>

I know the code looks kind of run together. That is on purpose, to prevent unwanted line breaks and such in the generated CSV file. Anyway, the first line sets up the page itself: it removes the stylesheets, header, and footer, and turns on caching. There are two reasonably important things here. The readOnly attribute allows a visualforce collection to hold 10,000 records instead of only 1,000, which is very useful for a query exporter. The second is the contentType="application/octet-stream#{!fileName}.csv" part. That tells the browser to treat the generated content as a file download with a .csv extension, which in most browsers should prompt a save dialog. You can also see that the filename is an Apex property generated by the class.

With the page setup, now we just need to construct the actual CSV values. To create the headers of the file, we simply iterate over that list of fields we split in the controller, putting a comma after each one (according to CSV spec trailing commas are not a problem so I didn’t worry about them). You can see I also invoke the {!newLine} method to create a proper CSV style newline after the header row. If anyone knows of a way to generate a newline character in pure visualforce I’d love to hear it, because I couldn’t find a way.

Lastly we iterate over the query results. For each record in the query, we then iterate over each field. Using bracket notation we can get the field value from the record dynamically. Again we create a newline at the end of each record. After this, on the SOQL Export object I simply created a button that invoked this page, passing in its record ID. The newly opened window provides the download and the user can then close it (I'm experimenting with ways to automatically close the window once the download is done, but it's a low priority and any solution would be rather hacky).
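The assembly the two nested repeats perform can be sketched in plain JavaScript, if that makes the page easier to follow. The field names and records below are hypothetical sample data, not anything from the real exporter.

```javascript
// Sketch of the CSV assembly the Visualforce page performs: a header row from
// the split field list, then one row per record, reading each field value
// dynamically via bracket notation. Trailing commas are left in, matching
// the page's output.
function buildCsv(queryFields, queryResults) {
    var csv = '';
    // Header row: one column per field name.
    queryFields.forEach(function (fieldName) { csv += fieldName + ','; });
    csv += '\n';
    // Data rows: record[fieldName] mirrors {!record[fieldName]} on the page.
    queryResults.forEach(function (record) {
        queryFields.forEach(function (fieldName) { csv += record[fieldName] + ','; });
        csv += '\n';
    });
    return csv;
}
```

Note that a production version would also need to quote values containing commas or newlines; like the page itself, this sketch skips that.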

There you have it. A simple SOQL query export tool. I have this packaged up, but I'm not 100% sure I can give that URL away right now. I'll update this entry if it turns out I'm allowed to share it. Anyway, I hope this helps someone, or if nothing else shows a couple of neat techniques you might be able to use.


Entity is deleted on apex merge

Hey guys,

Just a little quick-fix post here: a silly little bug that took me a bit of time to hunt down (probably just because I hadn't had enough coffee yet). Anyway, the error happened when trying to merge two accounts together; I was getting the error 'entity is deleted'. The only thing that made my code any different from other examples was that the account I was trying to merge was selected by picking it from a lookup on the master. The basic code looked like this (masterAccount was being set by the constructor for the class, so it is already set up properly).

            try
            {
                Account subAccount = new Account(id=masterAccount.Merge_With__c);
                merge masterAccount subAccount;
                mergeResult = 'Merge successful';
            }
            catch(exception e)
            {
                mergeResult = e.getMessage();
            }

Can you spot the problem here? Yup: because the Merge_With__c field on the master account would now be referencing an account that doesn't exist (since after a merge the child records get removed), it was throwing that error. So simple once you realize it. Of course the fix is pretty easy as well: just null out the lookup field before the merge call.

            try
            {
                Account subAccount = new Account(id=masterAccount.Merge_With__c);
                masterAccount.Merge_With__c = null;
                merge masterAccount subAccount;
                mergeResult = 'Merge successful';
            }
            catch(exception e)
            {
                mergeResult = e.getMessage();
            }

There you have it. I realize this is probably kind of a ‘duh’ post but it had me stumped for a few minutes, and I’m mostly just trying to get back into the swing of blogging more regularly, so I figured I’d start with something easy. ‘Till next time!



Stripping Nulls from a JSON object in Apex

NOTE: If you don't want to read the wall of text/synopsis/description, just scroll to the bottom. The function you need is there.

I feel dirty. This is the grossest hack I have had to write in a while, but it is also too useful not to share (I think). Salesforce did us an awesome favor by introducing the JSON.serialize utility; it can take any object and serialize it into JSON, which is great! The only problem is that you have no control over the output JSON; the method takes no params except for the source object. Normally this wouldn't be a big deal. I mean, there isn't a lot to customize about JSON usually, it just is what it is. There is however one case where you may want to control the output, and that is in the case of nulls. You see, most of the time when you are sending JSON to a remote service, if you have a param specified as null, it will just skip over it as it should. Some of the stupider APIs try to process that null as if it were a value. This is especially annoying when the API has optional parameters and you are using a language like Apex, which being strongly typed makes it very difficult to modify an object at run time to remove a property. For example, say I am ordering a pizza via some kind of awesome pizza ordering API. The API might take a size, some toppings, and a desired delivery time (for future deliveries). Their API documentation states that delivery time is an optional param, and if not specified the pizza will be delivered as soon as possible, which is nice. So I write my little class in apex:

    public class pizzaOrder
    {
    	public string size;
    	public list<string> toppings;
    	public datetime prefferedDeliveryTime;
    
    }
    
    public static string orderPizza(string size, list<string> toppings, datetime prefferedDeliveryTime)
    {
    	pizzaOrder thisOrder = new pizzaOrder();
    	thisOrder.size = size;
    	thisOrder.toppings = toppings;
    	thisOrder.prefferedDeliveryTime	= prefferedDeliveryTime;
    	
    	string jsonOrderString = JSON.serialize(thisOrder);
    	
    	return jsonOrderString;
    }
    
    list<string> toppings = new list<string>();
    toppings.add('cheese');
    toppings.add('black olives');
    toppings.add('jalepenos');
                     
    orderPizza('large', toppings, null);

And your resulting JSON looks like

{"toppings":["cheese","black olives","jalepenos"],"size":"large","prefferedDeliveryTime":null}

Which would work beautifully, unless the Pizza API is set up to treat any key present in the JSON object as an actual value, which in this case would be null. The API would freak out saying that null isn't a valid datetime, and you are left yelling at the screen trying to figure out why the stupid API can't figure out that if an optional param has a null value, it should just skip it instead of trying to evaluate it.

Now in this little example you could easily work around the issue by just specifying prefferedDeliveryTime as the current date time if the user didn't pass one in. Not a big deal. However, what if there isn't a valid default value to use? In my recent problem there is an optional account number I can pass in to the API. If I pass it in, it uses that. If I don't, it uses the account number set up in the system. So while I want to support the ability to pass in an account number, if the user doesn't enter one my app will blow up, because the API explodes when it encounters a null value for that optional param. I can't not have a property for the account number because I might need it, but including it as a null (the user just wants to use the default, which Salesforce has no idea about) makes the API fail. Ok, whew, so now hopefully we all understand the problem. Now what the hell do we do about it?

While trying to solve this, I explored a few different options. At first I thought of deserializing the JSON object back into a generic object (map<string,object>), checking for nulls in any of the key/value pairs, removing them, then serializing the result. This failed due to difficulties with detecting the type of each value (tons of 'unable to convert list<any> to map<string,object>' errors that I wasn't able to resolve). Of course you also have the recursion issue, since you'd need to look at every element in the entire object, which could be infinitely deep/complex, so that adds another layer of complexity. Not impossible, but probably not super efficient, and I couldn't even get it to work. Best of luck if anyone else tries.

The next solution I investigated was writing my own custom JSON generator that would just not put nulls in the object in the first place. This too quickly fell apart, because I needed a generic function that could take a string or an object (not both, just a generic thing of some kind) and turn it into JSON, since this function would have to be used to strip nulls from about 15 different API calls. I didn't look super hard at this because all the code I saw looked really messy and I just didn't like it.

The solution I finally decided on, while gross, dirty, hackish and probably earning me a spot in programmer hell, is also simple and efficient. Once I remembered that JSON is just a string and can be manipulated as such, I started thinking about maybe using regex (yes, I am aware that when you solve one problem with regex, now you have two) to just strip out the nulls. Of course then you have to worry about cleaning up syntax (extra commas, commas against braces, etc.) when you just rip elements out of the JSON string, but I think I've got a little function here that will do the job, at least until salesforce offers a 'don't serialize nulls' option in their JSON serializer.

    public static string stripJsonNulls(string JsonString)
    {

    	if(JsonString != null)   	
    	{
			JsonString = JsonString.replaceAll('\"[^\"]*\":null',''); //basic removal of null values
			JsonString = JsonString.replaceAll(',{2,}', ','); //remove duplicate/multiple commas
			JsonString = JsonString.replace('{,', '{'); //prevent opening brace from having a comma after it
			JsonString = JsonString.replace(',}', '}'); //prevent closing brace from having a comma before it
			JsonString = JsonString.replace('[,', '['); //prevent opening bracket from having a comma after it
			JsonString = JsonString.replace(',]', ']'); //prevent closing bracket from having a comma before it
    	}
  	
	return JsonString;
    }

Which after running on our previously generated JSON we get

{"toppings":["cheese","black olives","jalepenos"],"size":"large"}

Notice: no null prefferedDeliveryTime key. It's not null, it's just nonexistent. So there you have it: 6 lines of find and replace to remove nulls from your JSON object. Yes, you could combine them and probably make it a tad more efficient; I went for readability here. So sue me. Anyway, hope this helps someone out there, and if you end up using this, I'm sure I'll see you in programmer hell at some point. Also, if anyone can make my initial idea of recursively spidering the JSON object and rebuilding it as a map<string,object> without the nulls work, I'd be most impressed.
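For what it's worth, the recursive parse/rebuild idea is pretty painless in a dynamically typed language; Apex's strong typing is what makes it ugly there. Here is a sketch of that approach in JavaScript (illustration only, not a drop-in for the Apex version):

```javascript
// Sketch of the parse/walk/rebuild approach: deserialize, recursively copy
// the structure while dropping null-valued object keys, then re-serialize.
// Nulls inside arrays are deliberately kept, since array positions may matter.
function stripNulls(value) {
    if (Array.isArray(value)) {
        return value.map(stripNulls);
    }
    if (value !== null && typeof value === 'object') {
        var cleaned = {};
        Object.keys(value).forEach(function (key) {
            if (value[key] !== null) {
                cleaned[key] = stripNulls(value[key]);
            }
        });
        return cleaned;
    }
    return value;
}

// The pizza order JSON from above, run through the stripper.
var order = '{"toppings":["cheese","black olives","jalepenos"],"size":"large","prefferedDeliveryTime":null}';
var stripped = JSON.stringify(stripNulls(JSON.parse(order)));
// stripped no longer contains the prefferedDeliveryTime key
```

Because this works on the parsed structure rather than the raw string, it can't accidentally mangle a string value that happens to contain ":null" the way a pure regex approach could.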



Automatic Resize of Embedded/Inline Visualforce pages

I'm pretty sure at one time or another most of us developers have made a sweet inline visualforce page that lives on an object's detail page. The problem is, since it's all dynamic and cool, we don't know how 'tall' the content might end up being. We are basically stuck with having to guess at a decent height for the VF page and enable scrollbars, which kind of blows. Well, not anymore (some exceptions apply)! Expanding on my previous post about using HTML5 postMessaging to move data between embedded visualforce pages and the standard salesforce domain, I've come up with a first release of a 'framework' of sorts (term used very liberally). First, an example. Let's check out what I'm talking about using my wordcloud app to demonstrate.

BEFORE

Scrollbars are Sad

 

AFTER
Woot woot no scrolling :D

So how did I accomplish this? The trick, as you may be able to guess by now, is the HTML5 postMessaging feature that I wrote about. By enabling communication between the frame (the visualforce page) and the parent (the detail page), it's fairly simple to have the frame report its size to the parent and have the parent adjust the height of the frame accordingly. Of course that isn't everything my framework can do, oh no. It allows for passing arbitrary commands/data between the two, so you can extend the functionality to pass any kind of commands/DOM manipulations you might want between them. The caveat is that this only works on newer browsers, since postMessaging is fairly new. Also you'll need to enable the sidebar on all pages, since that is where the proxy component lives. If you want to play with this, install my alpha package from

https://login.salesforce.com/packaging/installPackage.apexp?p0=04ti0000000TY0J

Then include the home page component in the narrow/sidebar and enable the sidebar for all pages (setup->user interface->Show Custom Sidebar Components on All Pages). After that you just include these two components on any page you want to resize itself when embedded. The first component is the core piece that talks to the home page component. The second one is just some code that utilizes that framework to cause the iframe to resize. I’d put them at the bottom of your visualforce page.

<c:vfProxy />

<c:vfProxy_autoHeight />

I’m planning on possibly adding more ‘plugins’ for my framework to do other common handy things so if you have any ideas for potential plugins let me know.


Communication Between Visualforce Pages and Standard Pages Via Iframes

So I had what I thought was going to be a quick, easy project recently, an idea spawned from some spit-balling at the office. Basically we wanted to create a page where you could queue up records, and it would show you the edit screen for them one at a time. Hit save, the next record comes up in edit mode; save it, the next record comes up in edit mode. Sounds a lot like the inline list view editor, I know, but this would give you the full edit screen, the ability to modify rich text areas, etc. I dunno, some other people wanted it. I decided it sounded fun and easy enough to build, so I decided to take a crack at it.

I had a few internal conditions I wanted to satisfy as well. I wanted it to be lightweight, fast, and not require much if any configuration to work for any object in Salesforce. It should be flexible, easy and straightforward. I didn't want some huge apex controller backing it, required custom buttons or anything weird like that. This was a simple project and it should have a simple solution. The approach I decided to go with was to create a visualforce page that received a list of Ids to edit; it would then request the record for each of those Ids via an iframe. The iframe avoids having to run any messy queries, page layouts are respected, all that good stuff. Using some URL hacks I could tell it to load the record straight in edit mode (if you append a /e to the end of a record URL it loads straight into edit mode), and even set the return URL (where the browser is directed after the user saves/cancels). Seemed easy enough: have the frame load the first record, set the return URL to the edit mode of the next record, and boom, just follow that process until you exhaust the list. I forgot one crucial thing. Well, not so much forgot as ignored and hoped I could figure a way around it: cross-domain iframe security.

Let's back up one second. If you are not aware, when using iframes, if the parent frame and the child frame are both in the same domain you can use javascript to traverse them, inspect the child, share variables in the global JS scope, all that good stuff. If they are not on the same domain, about the only thing you can do is load the frame and tell when its source changes/reloads. That's it. You can't inspect it, you can't modify it, you have extremely minimal control. Salesforce hosts visualforce pages and standard object pages on different domains. Bummer. This means I could not modify the behavior of the save button, and I couldn't know if the user saved, cancelled, or anything else about what they were doing in the frame. This meant that after they got past the second record (the first I could load into the frame directly, and then by setting the returnURL I could get them to go to the second one) I could no longer easily control where the user was going. Sure, I could change the src on the iframe itself, but the only time I could do that was during a load event, which quickly gets you stuck in a loop. I know it sounds like it should be easy, but take a whack at it; I was unable to figure out an elegant solution after a couple of hours. Bottom line, this wasn't going to be as easy as I thought.

So here I was, kind of stuck. I could load the first record and even set it to save and return to the second record in edit mode, but that was about it. The iframe source doesn't change on record saves, and there isn't any way to modify the behavior of the save button to return to the next record URL. I did come up with a hacky version: by detecting every other page load I could change the source to the assumed next record. E.g., the first record loads and the user saves. The code detects the 2nd load event (which is now the detail view of the first record after the save) and changes the source to the next record, then waits for another reload and forwards them again. It worked, but it was hacky, unsatisfying and required waiting for the detail version of the page to load before forwarding the user on to the next edit mode. There had to be a better way, but I'd need to somehow modify the contents of the iframe.
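In case anyone wants to see it anyway, the every-other-load hack described above looks roughly like this. This is a standalone sketch, not my actual code: `frame` stands in for the iframe element and the record Ids are made up.

```javascript
// Sketch of the every-other-load workaround: count load events on the iframe.
// Odd-numbered loads are edit pages; even-numbered loads are the detail page
// shown after a save, so on those we forward the frame to the next record's
// edit URL (the /e suffix loads a record straight into edit mode).
function makeEditForwarder(frame, recordIds) {
    var loadCount = 0;
    var next = 0; // index of the next record to forward to
    return function onFrameLoad() {
        loadCount++;
        if (loadCount % 2 === 0 && next < recordIds.length) {
            frame.src = '/' + recordIds[next++] + '/e';
        }
    };
}
```

The double page load per record is exactly the ugliness that pushed me toward the postMessage approach below.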

I can’t say exactly how I remembered, but suddenly a few pieces of information kind of hit me all at once.
1) Sidebar components can run custom javascript and run off the core salesforce domain, not the visualforce domain.
2) The HTML5 spec gives us a postMessage API that lets frames from different domains pass information to each other.
3) Javascript can run arbitrary code using eval.

Could I stitch all this information together to build some kind of javascript proxy to run code on the core domain on behalf of my visualforce page? Indeed I could, and now I’ll show you how. Gather round; Kenji’s got a cool new trick to show you.

First you are going to need to enable custom sidebar components on all pages. Head to:

setup-> user interface -> check: Show Custom Sidebar Components on All Pages

Now create a narrow HTML homepage component. Slap this code in there. Yeah, I know eval is bad; you can of course replace this with specific function calls if you want, but I wanted to play around. Besides, SF is a mostly secure environment.

<div id="jsProxy" style="width:100%;height:100px"></div>
<script>
    function receiveMessage(event) {
		var result = new Object();
		result.success = true;
		result.callback = event.data.callback;
		result.command = event.data.command;
		
        try {
            document.getElementById("jsProxy").innerHTML = 'JS proxy received command: ' + event.data.command;
            result.data = eval(event.data.command);
         
        } catch (ex) {
			result.data = ex.message;
			document.getElementById("jsProxy").innerHTML = 'Error running command: ' + ex.message;
        }
		event.source.postMessage(result,event.origin);
    }
    window.addEventListener("message", receiveMessage, false);
</script>

What we just did is create an event listener that listens for any postMessages coming in. It hands them off to our receiveMessage function, which sets the content of our div so we can see what got passed in (mostly for debugging) and then attempts to run the received message data as code. This received code runs in the context of the core Salesforce page, not within Visualforce. With this we can modify elements on the standard object interface.

So now you might be saying: okay great, you can pass code to the page to run, but still, how are you going to change the returnURL? It's too late to try and pass that in the URL by the time the page loads, right? You are right, but thankfully Salesforce usually just takes URL parameters, jams them into hidden fields, and uses them from there. So it's just a matter of changing the value of the hidden 'retURL' field. I pass to my function a bit of code that locates that field and modifies it to be the URL of the next record in edit mode. Something like this.

    //handler for responses coming back from the sidebar proxy
    multiEdit.receiveMessage = function(event)
    {
        if(event.data.callback != null)
        {
            window.multiEdit[event.data.callback](event);
        }
    }

    //container object so we could specify a callback if we wanted to
    multiEdit.postMessageData = function(command, callback)
    {
         this.command = command;
         this.callback = callback;
    }

    window.addEventListener("message", multiEdit.receiveMessage, false);

    //where to go next (parse the index first, then add one)
    var nextRecord = multiEdit.objectIds[parseInt(indexPosition,10)+1];

    //create command to pass to listener to run
    var message = 'document.getElementById("retURL").value = "/'+nextRecord+'/e"';

    //create container variable so we could specify a callback if we wanted to
    var setReturnUrl = new multiEdit.postMessageData(message,null);

    //send the command
    iframeMessageWindow.postMessage(setReturnUrl,'*');

Boom, now the code for changing the field is passed via the postMessage API to the homepage component that has the ability to run said code and operate on the DOM within the iframe. Pretty slick eh? With this technique the border between visualforce pages and standard pages has pretty much been smashed. Sure it needs a bit of refining, but overall I think this could allow for some pretty cool stuff.
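If you end up sending more than one command this way, the string building is easy to factor out. A minimal sketch (this helper function is mine, not part of the utility described in the post):

```javascript
// Hypothetical helper: builds the snippet of code the sidebar proxy will eval
// to set a field's value on the core Salesforce page. It only builds the
// string; delivery still happens via postMessage as described above.
function buildSetFieldCommand(fieldId, value) {
    return 'document.getElementById("' + fieldId + '").value = "' + value + '"';
}
```

The retURL example then becomes `buildSetFieldCommand('retURL', '/' + nextRecord + '/e')`, wrapped in a postMessageData object and posted to the frame as before.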

BTW I’ll probably be releasing the utility that I wrote using this approach fairly soon. Till next time!


Salesforce Orchestra CMS Controller Extensions

So I’ve been working with Orchestra CMS for Salesforce recently, and for those who end up having to use it, I have a few tips.

1) If you intend on using jQuery (a newer version than the one they include), include it and put it in no conflict mode. Newer versions of jQuery will break the admin interface (mostly around trying to publish content), so you absolutely must put it in no conflict mode. This one took me a while to debug.

2) While not officially supported, you can use controller extensions in your templates. However, the class and all contained methods MUST be global. If they are not, again, you will break the admin interface. This was kind of obvious after the fact, but it took me well over a week to stumble across the fix. The constructor for the extension takes a cms.CoreController object. Alternatively, if you don't want to mess with extensions, you can use apex:include to include another page that has its controller set to whatever you want. The included page does not need to have the CMS controller as its primary controller, so you can do whatever you want there. I might actually recommend that approach, since Orchestra's official stance is that they do not support extensions, and even though I HAD it working, today I am noticing it act a little buggy (not able to add or save new content to a page).

3) Don't be afraid to use HTML component types in your pages (individual items derived from your page template) to call javascript functions stored in your template. In fact, I found that you cannot call remoting functions from within an HTML component directly, but you can call a function which invokes a remoting function.

So if we combine the above techniques we’d have a controller that looks like this

global class DetailTemplateController
{
    global DetailTemplateController(cms.CoreController stdController) {

    }

    @remoteAction
    global static list<user> getUsers()
    {
        return [select id, name, title, FullPhotoUrl from user ];
    }
}

And your template might then look something like this

<apex:page id="DetailOne" controller="cms.CoreController" standardStylesheets="false" showHeader="false" sidebar="false" extensions="DetailTemplateController" >
	<apex:composition template="{!page_template_reference}">
		<apex:define name="header"> 
			<link href="//ajax.aspnetcdn.com/ajax/jquery.ui/1.10.3/themes/smoothness/jquery-ui.min.css" rel='stylesheet' />

			<script src="//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
			<script> var jqNew = jQuery.noConflict();</script> 
			<script src="//ajax.googleapis.com/ajax/libs/jqueryui/1.10.3/jquery-ui.min.js"></script> 

			<script>
        	        var website = new Object();
			jqNew( document ).ready(function() {
				console.log('jQuery loaded');
			});

			website.buildUserTable = function()
			{
				//remoting request
				Visualforce.remoting.Manager.invokeAction(
					'{!$RemoteAction.DetailTemplateController.getUsers}', 
					function(result, event){
						if (event.type === 'exception') 
						{
							console.log(event.message);
						} 
						else 
						{
							var cols = 0;

							var tbl = jqNew('#bioTable > tbody');
							var tr;
							for(var i = 0; i < result.length; i++)
							{
								if(cols == 0){tr = jqNew('<tr></tr>');}                              

								var td = jqNew('<td></td>');

								var img = jqNew('<img class="profilePhoto">');
								img.attr('src',result[i].FullPhotoUrl);
								img.attr('title',result[i].Title);
								img.attr('alt',result[i].Name);
								img.data("record", result[i]);
								img.attr('id',result[i].Id);

								td.append(img);

								tr.append(td);

								if(cols == 2 || i == result.length-1){
									tbl.append(tr);
									cols = -1;
								}
								cols++;

							}

						}
					})			
			}
			</script>
		</apex:define>
		<apex:define name="body">
			<div class="container" id="mainContainer">
				<div class="pageContent">
					<div id="header">
						<apex:include pageName="Header"/>
						<div id="pageTitle">
							<cms:Panel panelName="PageTitle" panelController="{!controller}" panelheight="50px" panelwidth="200px"/>
						</div>
					</div>
					<div id="pageBody">
						<p>
							<cms:Panel panelName="PageContentArea" panelController="{!controller}"  panelheight="200px" panelwidth="400px" />
						</p>
						<div class="clearfloat"></div>
					</div>

					<!-- end .content --> 
				</div>
			</div>
			<div id="footer_push"></div>
			<div id="footer">
				<apex:include pageName="Footer"/>
			</div>
		</apex:define>
	</apex:composition>
</apex:page>

Then in our page we can add an HTML content area and include

<table id="bioTable">
	<tbody></tbody>
</table>
<script>website.buildUserTable();</script>

So when that page loads it will draw that table and invoke the website.buildUserTable function. That function in turn calls the remoting method in our DetailTemplateController extension that we created. The query runs and returns the user data, which is then used to create the rows of the table that are appended to the #bioTable's body. It's a pretty slick approach that seems to work well for me. Your mileage may vary, but at least rest assured you can use your own version of jQuery, and you can use controller extensions, which I wasn't sure about when I started working with it. Till next time.


Salesforce Live Agent Review & Customization

So you are building a new website hosted on force.com and your boss says

‘Oh and we gotta have chat. Everyone has chat these days, we need it.’

Agree, disagree, doesn’t matter. You are doing it because rent is coming due and you can’t tell him that that idea is as bad as his comb-over (all purely hypothetical of course). So you start thinking about writing your own chat app because it sounds like fun (some UI options, push notifications, some cool chances to use javascript remoting maybe?), then realize you don’t have time for fun because this thing is due in like a week. So you frantically google around a bit and realize,

‘Wait a minute, Salesforce has its own native chat app “live agent”. That could probably do most of my work for me!’

Only question is: can you hack at it enough to make it do what you need? Does it have a pre-chat form? Does it have a post chat survey? Does it save logs? How about a queue system? The short answer is yes. I was actually blown away at how much time and energy they put into the live agent chat. It can do pretty much everything any reasonable person would ask of it, and a few unreasonable things as well. As a developer though, you just want to get it up and running as fast as possible so you can play with all the bells and whistles, right? Manuals and readmes are for suckers; let's just throw it on a page somewhere as a POC to make the boss man happy. So how do you go about doing that, and what does the process look like? In a nutshell, it's going to go like this.

1) Get the Salesforce liveagent trial for your org.
2) Log into Salesforce and go to customize->live agent.
3) Create a deployment, and a button
4) Paste the scripts they give you onto your webpage
5) Feel slightly disappointed that it was too easy and you didn’t get to derive any satisfaction from solving a problem yourself.

OH! Before I forget, I felt really dumb when I first set this up because I put it all in place and couldn't figure out how to actually let an agent log in to the chat. I kept seeing something about a console view, but the console didn't say anything about chat. It turns out you just have to create a console view (you know, like you do for cases or whatever) and in the lower right corner there is a chat login. It's all within Salesforce; there is no other service to authenticate to or anything, which is pretty sweet.

So now you show your boss, and he's like ‘Yeah that's cool, but we'd like to know who we are talking to via a pre-chat form, and if they don't exist in our system it would be cool to create a lead for them. Also if nobody is online we should just redirect them to a contact us form or something'. Ah-hah, finally something fun! So you probably saw while creating your chat button that there was a lookup field to a pre-chat form, but you didn't really have anything to populate that with, nor any idea how to build one. Well, it turns out Salesforce actually has a pretty robust API centered around their chat, and a couple of ways to pass information to the agent responding to the chat. I'm going to focus on the ‘simple' form based approach since I haven't used the javascript API they offer. This pre-chat form can actually perform lookups, pass data, and save information into the chat record itself; it's pretty wild, but a little confusing. So first check out the pre-chat form sample Salesforce provides, it gives a good basic understanding of how it works.

Salesforce Pre-Chat Form Sample

You can see that you create fields, then some hidden fields take the values of the user entered fields and run queries or pass that info along to the console. You can populate fields on the chat record by creating ‘fields’ that look like this

<input type="hidden" name="liveagent.prechat.save:Email"  value="Provided_Email__c" />

That says: take the value of the liveagent.prechat:Email field from this form and save it into the Provided_Email__c field on the chat transcript object. Of course you could reference another hidden field in that name attribute and use code to set the value of that other hidden field, allowing you to pass basically whatever you like into the chat transcript. You can create custom fields and pass values to them, as well as passing values to the standard fields.
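To make that chaining concrete, here's a small sketch of a script-populated hidden field paired with a save mapping. Both the Referrer form-field name and the Referring_Page__c custom field are hypothetical, made up for illustration:

```html
<!-- hypothetical names: Referrer / Referring_Page__c are illustrations only -->
<input type="hidden" name="liveagent.prechat:Referrer" id="referrer" value="" />
<input type="hidden" name="liveagent.prechat.save:Referrer" value="Referring_Page__c" />
<script>
    // set the hidden field's value before the form is submitted; the save:
    // mapping above then writes that value onto the chat transcript record
    document.getElementById('referrer').value = document.referrer;
</script>
```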

But now we need to solve our issue of looking up the contact or lead based on email, and if one doesn't exist, creating one on the fly and returning that. There are a few ways you could take this, but since I love javascript I decided to go with the javascript remoting approach. The user is going to enter their name and email; when they click submit, instead of actually submitting the form, a remoting function is going to run our query and return the result, or create a record. With that data we will populate the contactid or leadid field (depending on what kind of record it is) and pass that along to the console by using javascript to submit the form once the function has run. Additionally, using the ability to detect whether there are agents online, we can change the behavior of the buttons (well, actually we just show and hide different buttons) to send the user elsewhere if there is nobody online. It looks something like this.

<apex:page showHeader="false" controller="PreChatController">

<apex:variable var="deploymentId" value="572c0000000CaRW" />
<apex:variable var="orgId" value="00Dc0000001M6Ix" />
<apex:variable var="buttonId" value="573c0000000CaSe" />

<!-- This script takes the endpoint URL parameter passed from the deployment page and makes
it the action for the form -->
<script type='text/javascript' src='https://c.la7cs.salesforceliveagent.com/content/g/js/29.0/deployment.js'></script>

<script type='text/javascript'>
    liveagent.init('https://d.la7cs.salesforceliveagent.com/chat', '{!deploymentId}', '{!orgId}');
</script>

<script type="text/javascript">
    (function() 
    {
        function handlePageLoad()
        {
            var endpointMatcher = new RegExp("[\\?\\&]endpoint=([^&#]*)");
            document.getElementById('prechatForm').setAttribute('action',
            decodeURIComponent(endpointMatcher.exec(document.location.search)[1]));
        } 
        if (window.addEventListener) 
        {
            window.addEventListener('load', handlePageLoad, false);
        } 
        else 
        { 
            window.attachEvent('onload', handlePageLoad, false);
        }
    })();

    if (!window._laq) { window._laq = []; }

    window._laq.push(function()
    {
        liveagent.showWhenOnline('{!buttonId}', document.getElementById('prechat_submit'));
        liveagent.showWhenOffline('{!buttonId}', document.getElementById('liveagent_button_offline_{!buttonId}'));
    });

    function getLeadOrContact()
    {
        console.log('Getting lead or contact');
        var emailAddr = document.getElementById('email').value.trim();
        var fname = document.getElementById('name').value.trim();
        var phone = document.getElementById('phone').value.trim();

        try
        {
            Visualforce.remoting.Manager.invokeAction(
                '{!$RemoteAction.PreChatController.findLeadOrContactByEmail}', 
                fname,
                emailAddr,
                phone, 
                function(result, event)
                {
                    if (event.status) 
                    {
                        console.log(result);
                        if(result.Id.substring(0,3) === '003')
                        {
                            document.getElementById('contactid').value = result.Id;
                        }
                        else if(result.Id.substring(0,3) === '00Q')
                        {
                            document.getElementById('leadid').value = result.Id;
                        }
                        document.forms["prechatForm"].submit();

                        return true;
                    } 
                }, 
                {escape: false}
            );
        }
        catch(ex)
        {
            alert(ex.message);
            console.log(ex);
            return false;
        }
        return false;
    }   

</script>
<style>
body
{
    background-color:#f4f4f4;
}
#chatFormDiv
{
    width:200px;
    text-align:center;
    padding:5px;
}
#chatHeader
{
    color:#6d6d6d;
    font-size:18px;
    font-weight:bold;
}
label
{
    width:150px;
    font-weight:bold;
}
input[type=text], textarea
{
    width:200px;
    background: #f3f3f3; /* Old browsers */
    background: -moz-linear-gradient(top, #f3f3f3 0%, #ffffff 100%); /* FF3.6+ */
    background: -webkit-gradient(linear, left top, left bottom, color-stop(0%,#f3f3f3), color-stop(100%,#ffffff)); /* Chrome,Safari4+ */
    background: -webkit-linear-gradient(top, #f3f3f3 0%,#ffffff 100%); /* Chrome10+,Safari5.1+ */
    background: -o-linear-gradient(top, #f3f3f3 0%,#ffffff 100%); /* Opera 11.10+ */
    background: -ms-linear-gradient(top, #f3f3f3 0%,#ffffff 100%); /* IE10+ */
    background: linear-gradient(to bottom, #f3f3f3 0%,#ffffff 100%); /* W3C */
    filter: progid:DXImageTransform.Microsoft.gradient( startColorstr='#f3f3f3', endColorstr='#ffffff',GradientType=0 ); /* IE6-9 */    
    border-color: #dedede;
    border-top-color: #d3d3d3;
}
textarea
{
    height:140px;
}
.chatStatusDiv
{
    display:none;
}
</style>

<div id="chatFormDiv">
    <img src="{!URLFOR($Resource.BeaconWebsite,'img/chatIconSmallGrey.png')}" /> <span id="chatHeader">Chat</span><br/>
    <hr />

    <form method='post' id='prechatForm' onsubmit="return false;" action="https://15af.la7cs.salesforceliveagent.com/content/s/chat?language=en_US#deployment_id={!deploymentId}&org_id={!orgId}&button_id={!buttonId}">

    <input type='text' name='liveagent.prechat.name' id='name' placeholder="Your Name" required="required"/><br />

    <input type='text' name='liveagent.prechat:Email' id='email' placeholder="Email Address" required="required" /><br />

    <input type='text' name='liveagent.prechat:Phone' id='phone' placeholder="Phone" required="required" /><br />

    <textarea name='liveagent.prechat:Body' id='body' placeholder="Message" required="required" ></textarea><br />

    <input name="liveagent.prechat.buttons" value="{!buttonId}" type="hidden" /><br />

    <!-- Creates an auto-query for a matching Contact record’s Email field based on the
    value of the liveagent.prechat:Email field -->
    <input type="hidden" name="liveagent.prechat.query:Email" value="Contact,Contact.Email" />

    <!--- populate fields ---->
    <input type="hidden" name="liveagent.prechat.query:Email" value="Lead,Lead.Email" />
    <input type="hidden" name="liveagent.prechat.save:Email"  value="Provided_Email__c" />
    <input type="hidden" name="liveagent.prechat.save:name"   value="Provided_Name__c" />
    <input type='hidden' name='liveagent.prechat:ContactId'   value='' id='contactid'/>  
    <input type="hidden" name="liveagent.prechat.save:ContactId" value="Contact" />

    <input type='hidden' name='liveagent.prechat:LeadId' id='leadid' />  
    <input type="hidden" name="liveagent.prechat.save:LeadId" value="Lead" />

    <!--- Button that shows up if someone is online --->    
    <img   src="{!URLFOR($Resource.BeaconWebsite,'img/chatButton.png')}" 
           alt="Submit Form" 
           onclick="getLeadOrContact()"
           id='prechat_submit'
           style="display: none; border: 0px none; cursor: pointer; float: left;"/>

    <!--- Button that shows up if nobody is online --->
    <img id="liveagent_button_offline_{!buttonId}" 
         style="display: none; border: 0px none; cursor: pointer; float: left;" 
         src="{!URLFOR($Resource.BeaconWebsite,'img/chatButton.png')}" 
         onclick="alert('nobody online, put your redirect here');"/>

    </form>
</div>
</apex:page>

And the Apex Controller looks like this

global class PreChatController
{

    @remoteAction
    global static sobject findLeadOrContactByEmail(string name, string email, string phone)
    {
        sObject returnObject; //the id that will store the contact or lead this chat is related to
        //first we should see if there is a contact with this email
        list<contact> contacts = [select accountid, name, id, email from contact where email = :email limit 1];
        if(!contacts.isEmpty())
        {
            return contacts[0];
        }

        //if there is no contact, then lets look for a lead instead. Yeah, we could combine both queries into a single SOSL search, but the code for that doesn't
        //end up being much cleaner when you account for having to figure out what record to return when you have a list of lists.
        else
        {
            list<lead> leads = [select name, id, email from lead where email = :email limit 1];
            if(!leads.isEmpty())
            {
                return leads[0];
            }
            else
            {            
                lead thisLead = new lead();
                string[] nameParts = name.split(' ');

                thisLead.firstname = nameParts.size() > 1 ? nameParts[0] : ''; //if name parts contains more than one element that means we likely got a full name and the first part is the firstname. Otherwise nothing
                thisLead.lastname =  nameParts.size() > 1 ? nameParts[1] : nameParts[0]; //if name parts is greater than 1 then use the 2nd element as the lastname. Otherwise use the first element
                thisLead.phone = phone;
                thisLead.email = email;
                thisLead.company = name;
                thisLead.leadSource = 'Web Site';

                insert thisLead;
                return thisLead;
            }
        }
    }
}
The user is asked to enter their information to connect with an agent.

When the agent is alerted that there is a person waiting to chat, they automatically get a ‘screen pop' informing them of the contact details.

The full contact or lead record is available to the agent, making all their personal information instantly available. Much nicer than having to ask them a million questions.

Now when a user uses your chat form they will have to fill in their email, name and phone. Using that we can either locate them if they already exist, or create them on the fly and pass that information along to the agent. Pretty slick, eh? One small bug that I dislike and currently do not know how to fix: even though the contactId/leadId is passed to the console, those fields are not populated visually. So your agent doesn't see the lookups populated, even though they are. I think for now it's just a training point: tell them that when the chat is saved and closed, the transcript will be related to the contact/lead; you just can't see it on this screen. Weird, I know.

Anyway, I hope this helps you get started, and maybe even finished, implementing Live Agent. It's pretty cool and has a lot of features/power… way more than I would have guessed. Best of all, it makes you look like some kind of development wizard when you go from request to implementation in a few hours, so now you can get back to reddit… until the next request comes.


Visualforce Force Download of PDF or Other Content

Hey everyone,

This next trick is one I've kind of been keeping under my hat since it's a nice polishing touch for some of my contest entries, but I figured I should probably share it with the world now (information must be free, etc.). So we all know we can create Visualforce pages that render as PDF documents. It's a pretty cool feature, especially because business people love PDF files more than I love being a cynical ass (which is like… a lot). The one little annoyance is that normally when you create that PDF visualforce page the user is brought to it to view it, where they can then download it. Many times they simply want to download it and attach it to an email or something; the viewing isn't required and is generally just a few wasted seconds waiting for the page to load so they can hit file->save as. I have found/built a nifty way to force download of the file using a combination of Apex and some tricky DOM manipulation. As an added bonus I'll show you how to conditionally render the page as a PDF based on a URL param. Here we go!

The first thing we’ll need of course is our Visualforce page, we’ll keep it simple for this example. So here is our visualforce page

<apex:page controller="forceDownloadPDF" renderAs="{!renderAs}">
<h2>PDF Download example</h2>

<p>This is some content that could be displayed as a PDF or a regular web page depending on the URL params. The valid URL params are as follows</p>
<table width="100%" cellpadding="5" cellspacing="5">
    <tr>
        <th>Name</th>
        <th>Type</th>
        <th>Default</th>
        <th>Required</th>
        <th>Description</th>
    </tr>
    <tr>
        <td>pdf</td>
        <td>String with a boolean value</td>
        <td>null/false</td>
        <td>false</td>
        <td>if passed in as a true the page will be rendered as a PDF. Otherwise displayed as HTML</td>
    </tr>
    <tr>
        <td>force_download</td>
        <td>String with a boolean value</td>
        <td>null/false</td>
        <td>false</td>
        <td>If true the user will be prompted to download the contents of the page. Suggested to be paired with pdf=true</td>
    </tr>
    <tr>
        <td>filename</td>
        <td>String (valid file name)</td>
        <td>'My PDF Report [today's date].pdf'</td>
        <td>false</td>
        <td>A name for the file. Only used if force_download=true</td>
    </tr>    
</table>

</apex:page>

And now our controller

public class forceDownloadPDF {

    public string renderAs{get;set;}

    public forceDownloadPDF()
    {

        //figure out if the user passed in the pdf url variable and if it is set to true.
        if(ApexPages.currentPage().getParameters().get('pdf') != null && ApexPages.currentPage().getParameters().get('pdf') == 'true') 
        {
            //if so, we are rendering this thing as a pdf. If there were other renderas options that were valid we could consider allowing the user to pass
            //in the actual renderAs type in the url, but as it stands the only options are pdf and null so no reason to allow the user to pass that in directly.
            renderAs = 'pdf';

            //figure out if we are forcing download or not.
            if(ApexPages.currentPage().getParameters().get('force_download') != null && ApexPages.currentPage().getParameters().get('force_download') == 'true') 
            {
                //setup a default file name
                string fileName = 'My PDF Report '+date.today()+'.pdf';

                //we can even get more creative and allow the user to pass in a filename via the URL so it can be customized further
                if(apexPages.currentPage().getParameters().get('filename') != null)
                {
                    fileName = apexPages.currentPage().getParameters().get('filename') +'.pdf';
                }
                //here is where the magic happens. We have to set the content disposition as attachment.
                Apexpages.currentPage().getHeaders().put('content-disposition', 'attachment; filename="'+fileName+'"');
            }               
        }        
    }
}

As noted in the comments, the real secret here is setting the content disposition header using the Apex getHeaders method. Now you are saying,

‘But Kenji if I call that page from a link it still opens in a  new window it just forces the user to download the file. That’s not much better!’

Oh ye of little faith, of course I got you covered. You think I'd leave you with a half done solution like that? Hell no. Let's take this mutha to the next level. Here is what we are going to do: using a custom button with onClick javascript, we are going to create an iframe with the source set to that visualforce page (with the force_download=true param) and inject it into the DOM. When the frame loads (with 0 width and height so it's not visible), that code still runs, prompting the user to download the file. They are none the wiser that a frame got injected; all they see is a happy little download dialog prompt. So go create a custom button on an object that you want to prompt the user to download your file from. Make it a detail page button (you could do a list button too, but that's a topic for another day). Make it onClick javascript. Then slap this code in there.

ifrm = document.createElement("IFRAME"); 
ifrm.setAttribute("src", "/apex/yourPage?pdf=true&force_download=true&filename=My Happy File"); 
ifrm.style.width = 0+"px"; 
ifrm.style.height = 0+"px"; 
document.body.appendChild(ifrm);

Of course replace the ‘yourPage’ with the name of your visualforce page. The filename of course can be changed to be details from the record, or whatever you like. Now when the user clicks that button the javascript creates an invisible iframe and injects it into the DOM. Once it loads the user is prompted to download the file. Pretty slick eh?

Hope you dig it. Catch ya next time.


Building a Better WordCloud

Hey all,

I know it's been a while since my last post. Fun projects with cool end results have been rare and I'm not the type to post stuff just to fill space, so I've kinda just been chillin recently. That changed when I was asked to take another whack at putting together a word cloud app, this time for Salesforce. You may or may not remember that I created a small word cloud app a while ago that didn't have much of anything to do with Salesforce; it used a PHP back end that took free text replies from lime survey and created a real time word cloud. This time the task is a little different. I was asked to create a word cloud application that would take data from multiple records and multiple fields, aggregate it all together, and create a cloud from the resulting text blob. Real time updating was not requested, so I had some more flexibility in architecture. I also wanted to take this chance to upgrade the actual display a bit, since I wasn't very pleased with my last attempt (it was functional, but a little clunky; manual CSS rules specified formatting, etc.).

Since I knew there must be a good word cloud generator out there, I did a bit of searching and decided on the jQuery plugin ‘awesomeCloud', which makes very stylish clouds without being overly complex. Its licensing is pretty open (GPL v3), so it looked like a good fit. You can check out some of the sample word clouds it generates on the awesomeCloud demo page. It's got some nice options for theming, cloud shape, normalization and more.

So invoking the word cloud is pretty easy: you just need to create a div containing spans that each have a data-weight attribute specifying the frequency of that word. You then call the plugin, with the options you want, on the container div. Like so

Example of creating a wordcloud (taken from the github page of awesomeCloud)
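In case the image doesn't render, here's a minimal sketch of that invocation. The markup follows the div-with-weighted-spans pattern just described; the sample words are mine, and the option names mirror the component attributes defined later in the post:

```html
<div id="wordcloud" style="width:600px;height:400px;">
    <span data-weight="9">Salesforce</span>
    <span data-weight="5">Apex</span>
    <span data-weight="3">Visualforce</span>
</div>
<script>
    // invoke the plugin on the container div with the options you want
    $("#wordcloud").awesomeCloud({
        "grid" : 8,           // word spacing; smaller packs tighter but is slower
        "factor" : 0,         // 0 = automatically fill the container
        "normalize" : false,  // don't reduce outlier weights
        "font" : "Futura, Helvetica, sans-serif",
        "shape" : "circle"    // overall cloud shape
    });
</script>
```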

Easy enough? But now comes the tricky part: how do we get the data from the objects and fields, count the word frequencies, and then insert them into the DOM? That is where we come in. I decided that since this is going to be a pretty lightweight application, with most likely fairly small queries, we could get away with using the ajax toolkit, thus avoiding the extra complexities of Apex controllers. As usual I'm defaulting to using javascript where I can. You could of course modify the approach below to use Apex if it makes you happier. Whatever works for you. Anyway, let's get on with it.
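The frequency-counting piece is plain javascript either way. A minimal sketch, assuming the query results have already been mashed into one text blob (the function name is mine; the skip-word handling mirrors the skipwords attribute defined below):

```javascript
// Count word frequencies in a blob of text, ignoring skip words.
// In the component, textBlob would be built by concatenating the requested
// fields from the records returned by the ajax toolkit query.
function countWords(textBlob, skipwords) {
    var counts = {};
    var words = textBlob.toLowerCase().split(/\W+/);
    for (var i = 0; i < words.length; i++) {
        var word = words[i];
        if (word.length === 0 || skipwords.indexOf(word) !== -1) continue;
        counts[word] = (counts[word] || 0) + 1;
    }
    return counts;
}
```

Each resulting entry then becomes one of the weighted spans in the container div, with the count as its data-weight.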

I decided that since this code may be used in different places to do different things (maybe some inline pages, a stand alone VF page, a dashboard component maybe?), it made sense to make it into a component. That makes it easier to pass in configuration parameters as well, which I figured people would want to do. I'd recommend doing the same. First up, let's go ahead and define the parameters we are going to allow to be passed into the component. Most of these are for configuring the word cloud itself, but there are a few others.

    <apex:attribute name="objectType" description="type of records to get data from" type="String" required="true"/>
    <apex:attribute name="records" description="records to get data from" type="String" required="false"/>
    <apex:attribute name="fields" description="fields on records to aggregate data from" type="String" required="true" default="Name"/> 

    <apex:attribute name="lowerbound" description="words below this frequency will not be displayed" type="String" required="false" default="0"/> 
    <apex:attribute name="skipwords" description="words not to display regardless of frequency" type="String" required="false" default="and,the,to,a,of,for,as,i,with,it,is,on,that,this,can,in,be,has,if"/> 

    <apex:attribute name="grid" description="word spacing; smaller is more tightly packed but takes longer" type="integer" required="false" default="8"/>    
    <apex:attribute name="factor" description="font resizing factor; default 0 means automatically fill the container" type="integer" required="false" default="0"/>
    <apex:attribute name="normalize" description="reduces outlier weights for a more attractive output" type="boolean" required="false" default="false"/> 

    <apex:attribute name="font" description=" font family, identical to CSS font-family attribute" type="string" required="false" default="Futura, Helvetica, sans-serif"/> 
    <apex:attribute name="shape" description="one of 'circle', 'square', 'diamond', 'triangle', 'triangle-forward', 'x', 'pentagon' or 'star'" type="string" required="false" default="circle"/> 

    <apex:attribute name="backgroundColor" description="background color" type="string" required="false" default="transparent"/>
    <apex:attribute name="colorTheme" description="dark or light" type="string" required="false" default="light"/>

    <apex:attribute name="width" description="how wide should the cloud be (css values such as a pixel or inch amount)?" type="string" required="false" default="600px"/>  
    <apex:attribute name="height" description="how tall should the cloud be (css values such as a pixel or inch amount)?" type="string" required="false" default="400px"/>  
    <apex:attribute name="autoRefresh" description="should the wordcloud automatically refresh every so often?" type="boolean" required="false" default="false"/>

    <apex:attribute name="refreshInterval" description="how often should the cloud refresh? In seconds" type="integer" required="false" default="5"/>

Of course you’ll need to include the required javascript libraries.

<script src="//code.jquery.com/jquery-latest.js"></script>
<script src="{!$Resource.jQueryWordCloud}"></script>
<apex:includeScript value="/soap/ajax/15.0/connection.js"/>
<apex:includeScript value="/soap/ajax/15.0/apex.js"/>

Now we’ll need to start putting together the javascript functions to get the data and do the word frequency analysis. First up, let’s create a javascript method that can build a dynamic query to get all the required data. Using the AJAX toolkit it looks something like this:

function getData(objectType,recordIds, fields)
{
    //build the basic SOQL query string
    var queryString = "Select "+fields+" from "+objectType;

    //if we are limiting the query by searching for specific object ids then add that condition
    if(recordIds.length > 0)
    {
        //wrap each id in quotes so the clause reads: where id in ('id1','id2')
        var whereStatement = "('" + recordIds.split(",").join("','") + "')";
        queryString += ' where id in ' + whereStatement;
    }
    //make sure to put a limit on there to stop big query errors.
    queryString += ' limit 2000';

    //run the query
    result = sforce.connection.query(queryString);

    //get the results
    records = result.getArray("records");

    //lets aggregate all the fetched data into one big string
    var wordArray = [];

    //we want to build an array that contains all the words in any of the requested fields. So we will
    //put all the desired fields into an array, iterate over all the records, then over all the fields
    //and stash all the data in another array.

    //fields is a comma separated string. Lets split it into an array so we can iterate over them easily 
    var fieldsArray = fields.split(',');

    //loop over all the records
    for (var i=0; i< records.length; i++)
    {
        var record = records[i];
        //loop over all the fields that we care about
        for(var j=0; j<fieldsArray.length; j++)
        {
            //get the value of this field from this record and add it to the array
            wordArray.push(record[fieldsArray[j]]);
        }
    }  
    //we will now pass all the words from all the records' fields into the getWordFrequency function. By calling join
    //on the array with a space character we get one big ass text chunk with all the words from all the records.

    var frequencyResult = getWordFrequency(wordArray.join(' '));

    //pass our frequency result data into the load cloud function
    loadCloud(frequencyResult); 
}
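If you want to sanity check the query string building outside of Salesforce, the same logic can be pulled into a tiny standalone function (buildQuery is just an illustrative name for this sketch):

```javascript
// standalone sketch of the dynamic SOQL string building used in getData.
// wraps each comma separated id in quotes so the clause reads: id in ('a','b')
function buildQuery(objectType, recordIds, fields) {
    var queryString = "Select " + fields + " from " + objectType;
    if (recordIds.length > 0) {
        queryString += " where id in ('" + recordIds.split(",").join("','") + "')";
    }
    //keep a limit on there to stop big query errors
    return queryString + " limit 2000";
}
```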

With that we can pass in an sObject type, an optional list of record ids (comma separated) and a list of fields to get data from on those records (also comma separated). Now we need to build the getWordFrequency function. It’s going to take a chunk of text and return an array of objects that each contain a tag property and a freq property. That data then gets fed into awesomeCloud to generate the cloud.

            //takes a string and counts the frequency of each word in it. Returns an array of objects with tag and freq properties.
            //words should be space separated
            function getWordFrequency(wordString){

                //convert the string to lower case, trim spaces, clean up some special chars and split it into an array on spaces and slashes.
                var sWords = wordString.toLowerCase().trim().replace(/[,;.]/g,'').split(/[\s\/]+/g).sort();
                var iWordsCount = sWords.length; // count w/ duplicates

                // array of words to ignore
                var ignore = '{!skipwords}'.split(',');
                ignore = (function(){
                    var o = {}; // object prop checking > in array checking
                    var iCount = ignore.length;
                    for (var i=0;i<iCount;i++){
                        o[ignore[i]] = true;
                    }
                    return o;
                }());

                var counts = {}; // object for math
                for (var i=0; i<iWordsCount; i++) {
                    var sWord = sWords[i];
                    if (!ignore[sWord]) {
                        counts[sWord] = counts[sWord] || 0;
                        counts[sWord]++;
                    }
                }

                //get the lower bound as an integer. The lower bound controls the minimum frequency a word/tag must have for it to appear in the cloud
                var lowerBound = parseInt({!lowerbound},10);
                var arr = []; // an array of objects to return
                for (sWord in counts) {
                    if(counts[sWord] > lowerBound)
                    {
                        arr.push({
                            tag: sWord,
                            freq: counts[sWord]
                        });
                    }
                }

                /* Sorting code, not really required for this purpose but kept in case it is decided that we want it for some reason.
                // sort array by descending frequency | http://stackoverflow.com/a/8837505
                return arr.sort(function(a,b){
                    return (a.freq > b.freq) ? -1 : ((a.freq < b.freq) ? 1 : 0);
                });
                */

                return arr;

            }
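Since that function leans on the {!skipwords} and {!lowerbound} merge fields, it won’t run outside of Visualforce as-is. Here’s the same counting logic as a standalone sketch with those values turned into plain parameters (wordFrequency is just an illustrative name):

```javascript
// standalone version of the word frequency counter: the skip words and the
// lower bound come in as parameters instead of Visualforce merge fields.
function wordFrequency(text, skipWords, lowerBound) {
    // build a lookup object for the skip words (prop checks beat array scans)
    var ignore = {};
    for (var i = 0; i < skipWords.length; i++) { ignore[skipWords[i]] = true; }

    // normalize, strip some punctuation, split on spaces and slashes
    var words = text.toLowerCase().trim().replace(/[,;.]/g, '').split(/[\s\/]+/g);

    // tally each word that isn't in the skip list
    var counts = {};
    for (var j = 0; j < words.length; j++) {
        var word = words[j];
        if (!ignore[word]) { counts[word] = (counts[word] || 0) + 1; }
    }

    // keep only the words that clear the lower bound
    var result = [];
    for (var w in counts) {
        if (counts[w] > lowerBound) { result.push({ tag: w, freq: counts[w] }); }
    }
    return result;
}
```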

Alright, so now we have all the data from all the records analyzed and the frequency of each word is known. Words we have decided to skip have been left out, and those that don’t make the cut against the lower bound are also excluded, hopefully leaving us only with actually interesting words. Now we have to pass that data to awesomeCloud along with our settings.

function loadCloud(data) 
{ 
    //wordcloud settings
    var settings = {
            size : {
                grid : {!grid},
                factor : {!factor},
                normalize: {!normalize}
            },
            color : {
                background: "{!backgroundColor}"
            },
            options : {
                color : "random-{!colorTheme}",
                rotationRatio : 0.5
            },
            font : "{!font}",
            shape : "{!shape}"
    } 

    //create array for tag spans
    var wordStringArray = [];

    //evaluate each array element, create a span with the data from the object
    $.each(data, function(i, val)
    {
        wordStringArray.push('<span data-weight="'+val.freq+'">'+val.tag+'</span>');
    });

    //join all the array elements into a string. I've heard that pushing stuff into an array, joining, and then touching the DOM once is faster
    //than modifying a string (since strings are immutable) or modifying the DOM a ton, which makes sense.
    $( "#wordcloud" ).html(wordStringArray.join(''));

    //setup our word cloud
    $( "#wordcloud" ).awesomeCloud( settings );
}

Alright, so now we just need to invoke all these crazy functions. We’ll do that with a function call in the document onready handler to make sure everything is loaded before we start going all crazy modifying the DOM and such.

//login to salesforce API so we can run query
sforce.connection.sessionId = '{!$Api.Session_ID}';
$(document).ready(function() {
    //make an immediate call to getData with the info we need from the config params.
    getData('{!objectType}', '{!records}', '{!fields}');

    //if we are doing an auto refresh, rig that up now using the setInterval method
    if({!autoRefresh})
    {
        setInterval ( function(){ getData('{!objectType}', '{!records}', '{!fields}'); }, parseInt({!refreshInterval},10) * 1000 );
    }
});

Finally we just need to create our HTML container for the wordcloud and set up the CSS style.

    <style>

    .wordcloud {
        /*border: 1px solid #036;*/
        height: {!height};
        margin: 0.5in auto;
        padding: 0;
        page-break-after: always;
        page-break-inside: avoid;
        width: {!width};

    }

    </style>
    <div id="container" title="wordcloud with the content of fields {!fields} for objects {!records}"> 

        <div id="wordcloud" class="wordcloud" ></div>
    </div>

Whew, alright, so that’s it for our component. Now we just need a Visualforce page to invoke it. That part is easy. Although our word cloud is capable of aggregating data from multiple objects, for this demo we’ll keep it simple. We will create a little inline Visualforce page that can live on the account object. It will create a wordcloud from the description, name, type and ownership fields. To do that, create a Visualforce page like this:

<apex:page standardController="account">
    <!--
    WordCloud comes as a component that can be invoked from any visualforce page. You must pass it the object type to build the cloud for. The rest is optional.
    objectType: the type of sObject to get data from to power the word cloud
    fields: the fields on the objects whose content will be used to create the cloud. Must be comma separated
    records: a list of ids which to query for. If none is provided all records of the objectType are queried
    skipwords: words that will not be included in the word cloud no matter how many times they appear. 
    lowerbound: the minimum number of times a word must appear in the text before it is displayed in the cloud.
    grid: word spacing; smaller is more tightly packed but takes longer
    factor: font resizing factor; default "0" means automatically fill the container
    normalize: reduces outlier weights for a more attractive output
    shape: shape of the cloud. Must be one of "circle", "square", "diamond", "triangle", "triangle-forward", "x", "pentagon" or "star"
    font: font family, identical to CSS font-family attribute
    width: width of the cloud. Can be a percent or pixel/inch amount.
    height: height of the cloud. Must be a pixel or inch amount.
    backgroundColor: a hexadecimal (#000000) or string color value. Use 'transparent' for no background
    colorTheme: theme for word colors. Use 'dark' or 'light'
    autoRefresh: automatically refresh the cloud after a specified interval?
    refreshInterval: interval (in seconds) after which the cloud is automatically refreshed
    -->
    <c:wordCloud objectType="account" 
                 records="{!account.id}" 
                 fields="Name,Type,Ownership,Description"
                 lowerbound="1"
                 skipwords="and,an,any,so,or,are,the,to,a,of,for,as,i,with,it,is,on,that,this,can,in,be,has,if"
                 grid="8"
                 factor="0"
                 normalize="false"
                 font="Futura, Helvetica, sans-serif"
                 shape="triangle"
                 width="100%"
                 height="400px" 
                 backgroundColor="black"
                 colorTheme="light" 
                 autoRefresh="false" 
                 refreshInterval="15" />
</apex:page>

Save that page and add it to the Account page layout. Put a nice big blob of text in the description of an account and see what happens. For my example I figured it would be fitting to copy and paste text from the frequency analysis page of Wikipedia. The result looks like this.

Result of WordCloud

Pretty cool eh? Anyway this was just something I threw together in an hour or two, hopefully it’s useful to someone out there. Let me know what ya think!


Dynamic PDF Generator

I recently had a requirement where a list of any kind of sObject could be given to a Visualforce page, and it should spit out a PDF report of those objects. The fields returned could be defined by a field set, passed in the URL directly, or I could get passed nothing and would just have to query for all fields on the object. It was decided that the best course of action was to write a nice re-usable Apex class that can handle these requirements and use the Visualforce renderAs attribute to make it easy to generate printable reports. You can also easily rig up a custom button on a list view to grab the checked records and pass them into the exporter page, which basically allows exporting from any list view. The following is the first draft of said functionality.

4/03/2013 EDIT: Thanks to a good tip by Cal Smith I changed how the Visualforce page outputs the content and it seems to be much faster, and probably safer too. I also included two new params for the exporter. You can now provide a field to order by by specifying order_by in the url. Also, if you want the records returned in the same order the ids were provided in the url, you can specify return_in_order=true. This is probably slow on large data sets, but in cases where your users may have put records in the order they want, you can pass the ids in that order to the controller and the PDF will be generated with the same order. Kind of a nice feature, I thought.

4/05/2013 EDIT: I added the force_download and filename params to allow you to force the user to download the file and to specify a name for the downloaded file, instead of letting them view it in their browser. Not totally sure why someone might want this, but it was a request I got and it was fairly easy to add.
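The return_in_order option boils down to indexing the fetched records by id and then walking the requested id list. Here’s the same trick as a plain javascript sketch (sortInRequestedOrder is just an illustrative name; the real Apex version is the sortQueryInOrder method in the class below):

```javascript
// reorder records to match the order their ids were requested in:
// index each record by id, then walk the requested id list.
function sortInRequestedOrder(ids, records) {
    var byId = {};
    records.forEach(function (rec) { byId[rec.Id] = rec; });
    return ids.map(function (id) { return byId[id]; });
}
```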

The Apex Class

/*
Name: queryGenerator
Author: Daniel Llewellyn
Date: 4/02/2013
Description: Meant to be invoked by a visualforce page. This class can take url params to query for a list of any kind of
             sObject. Those sObjects can then be used to power user interface elements. By passing in a list of Ids and an
             optional field set, this class is able to determine the object type, find a matching field set, use a default field set
             or if none is specified query for all fields defined on the object. Useful for generating lists of sObjects when the type
             of object, and desired fields is not known ahead of time.

URL Params:
name            type         req      description
--------------------------------------------------------------------------------------------------------------------
ids:            csv of ids   true     a list of sObject Ids separated by commas. The objects to include in the return query
fields:         string       false    a comma separated list of fields to include. Takes precedence over fieldSet if specified.
fieldSet:       string       false    the name of a fieldset to use to determine which fields to include. Used if fields param not specified. If both are null, all fields are queried.
order_by        string       false    the name of a field on the object to order the results by
return_in_order boolean      false    should the results be returned in the same order they were provided in the URL? Overrides the order_by param if set to true.
force_download  boolean      false    should the PDF file be forced to download instead of displayed in the browser window?
filename        string       false    the name to assign to the downloaded file if force_download is set to true. Defaults to object label + ' report.pdf'. Do not include .pdf; it is appended automatically.

Gotchas:
Due to the way the query is built (filtering by a list of Ids) you can probably only get about 500 records max before the query length gets too long.
Shouldn't be a big deal though; a report of more than 500 records starts to get kind of meaningless most of the time. It will attempt to gracefully handle
any errors and return them nicely to the user.

*/

public class queryGenerator
{
    //Params. Can be used in your visualforce page to customize report data.
    public Schema.SObjectType OBJECT_SCHEMA_TYPE{get;set;}
    public string OBJECT_TYPE_NAME{get;set;}
    public string OBJECT_TYPE_LABEL{get;set;}
    public string ORDER_BY{get;set;}
    public boolean RETURN_IN_ORDER{get;set;}
    public list<string> OBJECT_FIELDS{get;set;}
    public map<string,string> OBJECT_FIELD_MAP{get;set;}
    public list<id> OBJECT_IDS{get;set;}
    public list<sobject> OBJECTS{get;set;}
    public integer RECORD_COUNT{get { 
        return objects.size();
    }set;}

    public queryGenerator(){

        try
        {

            OBJECT_FIELD_MAP = new map<string,string>();

            //get the list of ids to query for. We expect them to come in a url param called ids, and they should be
            //comma separated. Since we know that, we can split them on , to get a list of ids.
            if(ApexPages.currentPage().getParameters().get('ids') == null)
            {
                throw new applicationException('Please include a comma separated list of ids to query for in the url by specifying ?ids=id1,id2,id3 etc');
            }
            OBJECT_IDS = ApexPages.currentPage().getParameters().get('ids').split(',');

            //use the id's getSObjectType method to figure out what kind of objects we got passed.
            OBJECT_SCHEMA_TYPE = OBJECT_IDS[0].getSObjectType(); 

            //caching describe results makes for faster iteration
            map<string,Schema.sObjectField> fieldMap = OBJECT_SCHEMA_TYPE.getDescribe().fields.getMap();

            for(Schema.SObjectField field : fieldMap.values())
            {
                OBJECT_FIELD_MAP.put(field.getDescribe().getName(),field.getDescribe().getLabel());
            }

            //get the name of this object type
            OBJECT_TYPE_NAME = OBJECT_SCHEMA_TYPE.getDescribe().getName();

            OBJECT_TYPE_LABEL = OBJECT_SCHEMA_TYPE.getDescribe().getLabel();

            //get the list of fields we will query for and display
            if(ApexPages.currentPage().getParameters().get('fields') == null)
            {
                OBJECT_FIELDS = getObjectQueryFields(OBJECT_SCHEMA_TYPE, ApexPages.currentPage().getParameters().get('fieldset'));    
            }
            else
            {
                OBJECT_FIELDS = ApexPages.currentPage().getParameters().get('fields').split(',');
            }
            //set the order by statement. If no order by is specified, just tell it to order by Id to prevent a syntax error
            if(ApexPages.currentPage().getParameters().get('order_by') != null)
            {
                ORDER_BY= ApexPages.currentPage().getParameters().get('order_by');   
            }
            else
            {
                ORDER_BY = 'Id';
            }            

            RETURN_IN_ORDER = false;
            if(ApexPages.currentPage().getParameters().get('return_in_order') != null && ApexPages.currentPage().getParameters().get('return_in_order') == 'true')
            {
                RETURN_IN_ORDER = true;   
            }       

            OBJECTS = getSojects();   

            if(ApexPages.currentPage().getParameters().get('force_download') != null && ApexPages.currentPage().getParameters().get('force_download') == 'true') 
            {
                string fileName = 'Report of '+OBJECT_TYPE_LABEL+'.pdf';
                if(apexPages.currentPage().getParameters().get('filename') != null)
                {
                    fileName = apexPages.currentPage().getParameters().get('filename') +'.pdf';
                }
                Apexpages.currentPage().getHeaders().put('content-disposition', 'attachment; filename='+fileName);
            }    

        }
        catch(exception ex)
        {
            //catch and return errors. Most often this will happen from a bad Id or field name being passed in.
            system.debug('\n\n\n------Error occured during page init!');
            system.debug(ex.getMessage() + ' on line ' + ex.getLineNumber());
            ApexPages.addmessage(new ApexPages.message(ApexPages.severity.WARNING,ex.getMessage() + ' on line ' + ex.getLineNumber()));

        }
    } 
    //this method will be invoked by a visualforce page. It will determine the sObject
    //type by examining the Ids passed in the ids param. Once it knows the object type it will
    //then attempt to locate the specified fieldset if one was passed in the URL. If no fieldset
    //was provided, then it will query for all sObject fields.

    public list<sobject> getSojects()
    {
        list<sobject> queryResults;

        //turn the list of fields resolved in the constructor into a comma
        //separated string suitable for the query.
        string queryFields = listToCsv(OBJECT_FIELDS);

        //build this query string
        string queryString = 'select ' + queryFields + ' from ' + OBJECT_TYPE_NAME + ' where id in :OBJECT_IDS ORDER BY '+ORDER_BY;

        if(queryString.length() > 10000)
        {
            throw new applicationException('Query too long ('+queryString.length()+'). Please reduce the number of ids or reduce the number of fields queried for to get the length under 10,000');
        }
        //run the query.
        queryResults = database.query(queryString);

        if(RETURN_IN_ORDER)
        {
            queryResults = sortQueryInOrder(OBJECT_IDS, queryResults);
        }
        return queryResults;
    }

    //takes the list of sObjects and sorts them in the order they were passed in the URL. This allows for a custom sorting order to be passed in
    //without having to make use of the SOQL order by clause which may not be robust enough to handle the types of sorts desired.
    //WARNING THIS IS PROBABLY PRETTY DAMN SLOW!
    public list<sObject> sortQueryInOrder(list<id> objectOrder, list<sObject> objects)
    {
        map<id,sObject> objectMap = new map<id,sObject>();
        list<sObject> sortedList = new list<sObject>();
        for(sObject obj : objects)
        {
            objectMap.put((id) obj.get('id'), obj);
        }

        for(id objId : objectOrder)
        {
            sortedList.add(objectMap.get(objId));
        }
        return sortedList;

    }
    //takes an sObject type and optional name of a fieldset for that sObject type (can be null). Returns a list
    //of strings of fields to query for either based on the fieldset, or by finding all sObject fields if no fieldSet
    //is specified, or a matching fieldSet can not be found.
    public list<string> getObjectQueryFields(Schema.SObjectType objectType, string fieldSetName)
    {
        set<string> fields = new set<string>();
        Schema.FieldSet thisFieldSet = null;

        //first get any fieldsets that are defined for this object type. It is possible this might be empty.
        Map<String, Schema.FieldSet> fieldSetMap = objectType.getDescribe().fieldSets.getMap();  

        //check to see if the user passed in a field set, and if so, does it exist? 
        //if so, use that fieldset. Otherwise, use all fields on the object
        if(fieldSetName != null && fieldSetMap.containsKey(fieldSetName))
        {
            thisFieldSet = fieldSetMap.get(fieldSetName);
            //now that we know what field set we are using we have to iterate over it and get its FieldSetMembers
            //and add each field into the query string.
            for(Schema.FieldSetMember f : thisFieldSet.getFields())
            {
                fields.add(f.getFieldPath());
            }            
        }             

        //if there are no field sets defined for this object, then lets just query for all the fields
        else
        {
            fields = getObjectFields(objectType);            
        }

        //return our variable that contains a properly comma seperated list of all the fields to query for.
        list<string> fieldList = new list<string>();
        fieldList.addAll(fields);
        return fieldList;
    }

    //a simple possibly overly abstracted method to get the fields on an object
    public set<string> getObjectFields(Schema.SObjectType objectType)
    {
        return objectType.getDescribe().fields.getMap().keySet();
    }

    //takes a list of strings and returns them comma separated, suitable for feeding into a query.
    public string listToCsv(list<string> stringList)
    {
        string itemList = '';
        for(string thisString : stringList)
        {
            itemList += thisString+',';
        }
        //trim the trailing comma (guard against an empty list)
        if(itemList.length() > 0) itemList = itemList.substring(0,itemList.length()-1);
        return itemList;
    }

    @isTest
    public static void testQueryGenerator()
    {
        //setup our test account
        Account testAccount = new Account();
        testAccount.name = 'My Test account';
        testAccount.billingStreet = '1234 Test Street';
        testAccount.billingState = 'NY';
        testAccount.billingPostalCode = '55555';
        testAccount.billingCountry = 'USA';

        insert testAccount;

        test.StartTest();

        PageReference pageRef = Page.exportPdf;
        Test.setCurrentPage(pageRef);

        //run it with no ids. It will come back with no records since it will error. Since the error gets caught
        //we don't need to try/catch here though.
        queryGenerator qg = new queryGenerator();

        //run test with nothing but ids specified. This will make it query for all fields
        ApexPages.currentPage().getParameters().put('ids', testAccount.id);        
        qg = new queryGenerator();

        //make sure it found our account
        system.assertEquals(1,qg.RECORD_COUNT);
        system.assertEquals(testAccount.name,(string) qg.OBJECTS[0].get('name'));

        ApexPages.currentPage().getParameters().put('fields', 'name,id,billingStreet');        
        qg = new queryGenerator();        
        //make sure it found our account
        system.assertEquals(1,qg.RECORD_COUNT);
        system.assertEquals(testAccount.billingStreet,(string) qg.OBJECTS[0].get('billingStreet'));

        ApexPages.currentPage().getParameters().put('order_by', 'name'); 
        ApexPages.currentPage().getParameters().put('return_in_order', 'true'); 
        ApexPages.currentPage().getParameters().put('force_download', 'true');
        ApexPages.currentPage().getParameters().put('filename', 'My PDF file');
        qg = new queryGenerator();       

    }
    class applicationException extends Exception {}
}

The ExportPDF visualforce Page

<apex:page controller="queryGenerator" renderAs="pdf"  standardStylesheets="false">
<head>
  <style>
    @page {
        size:landscape;
        margin : .5in;
        @top-center {
            content : element(header);
         }

        @bottom-left {
            content : element(footer);
        }

    }
    table
    {
        width:100%;
    }
    div.footer {
        position : running(footer) ;
    }    
  </style> 
</head>
    <apex:pageMessages></apex:pageMessages>
    <h1>Report of {!OBJECT_TYPE_LABEL} ({!RECORD_COUNT} Records)</h1>

    <table>
        <tr>
            <apex:repeat value="{!OBJECT_FIELDS}" var="fieldName">
                <apex:outputText><th>{!OBJECT_FIELD_MAP[fieldName]}</th></apex:outputText>
            </apex:repeat>
        </tr>

        <apex:repeat value="{!OBJECTS}" var="rec">
            <tr>
                <apex:repeat value="{!OBJECT_FIELDS}" var="fieldName">
                    <apex:outputText><td>{!rec[fieldName]}</td></apex:outputText>
                </apex:repeat>
            </tr>
        </apex:repeat>
    </table>

    <div class="footer">
    <apex:outputText value="The Date: {0,date,MMMMM dd, yyyy 'at' hh:mm a}" styleClass="footer" >
        <apex:param value="{!NOW()}" />
    </apex:outputText> 
    </div>    
</apex:page>

Sample List View Button

window.open('/apex/exportPdf?ids='+ {!GETRECORDIDS($ObjectType.YOUR_OBJECT_TYPE)}+'&fieldset=YOUR_FIELD_SET_NAME_HERE&order_by=name&return_in_order=false','1364931211178','width=700,height=500,toolbar=0,menubar=0,location=0,status=1,scrollbars=1,resizable=1,left=0,top=0')

You’ll need to replace the $ObjectType.YOUR_OBJECT_TYPE and the fieldset=YOUR_FIELD_SET_NAME_HERE in the list view button. Or you can just remove the fieldset part entirely, or replace it with a ‘fields’ attribute where you can specify a comma separated list of fields to query for. You’ll probably want to play with the formatting of the report a little bit, but I’ll leave that as an exercise for the reader. Hopefully this helps someone out there.


Salesforce Dashboard Automatic Refresh Bookmarklet

Hey all,

Quick fun little chunk of code here for you. This code, when saved as a bookmarklet (javascript saved as a bookmark which runs on the current page when clicked), will cause Salesforce dashboards to automatically refresh every X seconds, where X is a variable near the top of the code (it defaults to 90 seconds). It also injects a little timer on the refresh button, and is smart enough to wait for the dashboard to refresh before it starts the next countdown. I haven’t cross browser tested it yet (built in Chrome 25), but as long as the browser supports the DOMSubtreeModified event listener you are probably fine. Just save the code as a bookmarklet, navigate to your dashboard page and click the bookmarklet. You should see a small timer show up on the refresh button. When the timer hits 0 the dashboard should refresh, and the timer will reset back to the default time and begin counting down again.

javascript:(
    function() 
    {
        var refreshInterval = 90; //number of seconds between each refresh
        var counter = refreshInterval;
        var timerInterval;
        var button = document.getElementById('refreshInput');
        if(button == null)
        {
            alert('Refresh button not found! Salesforce may have changed the button ID or it may not be visible for some reason. Please make sure you are on a dashboard page with the Refresh button visible');
            return false;
        }

        document.addEventListener("DOMSubtreeModified", function(event) {
            if(event.target.id == "componentContentArea")
            {
                startTimer();
            }
        }, true);

        function countDown(){
            counter--;
            button.value = "Refresh ("+formatTime(counter)+")";
            if(counter == 0)
            {
                button.click();
                counter = refreshInterval;
                window.clearInterval(timerInterval);            
                button.value = "Waiting for Refresh";
            }                
        }

        function startTimer()
        {
            window.clearInterval(timerInterval);
            timerInterval = setInterval(countDown, 1000);     
        }    

        function formatTime(seconds)
        {
            var totalSec = seconds;
            var hours = parseInt( totalSec / 3600 ) % 24;
            var minutes = parseInt( totalSec / 60 ) % 60;
            seconds = totalSec % 60;

            var result = (hours < 10 ? "0" + hours : hours) + ":" + (minutes < 10 ? "0" + minutes : minutes) + ":" + (seconds  < 10 ? "0" + seconds : seconds);            

            return result;
        }
        startTimer(); 
    }
)();
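Since the formatTime helper is pure, you can sanity check the countdown display on its own, outside the bookmarklet. Here’s a standalone version of the same arithmetic (using Math.floor instead of parseInt, which behaves the same for the positive numbers involved here):

```javascript
// Formats a second count as hh:mm:ss, matching the bookmarklet's timer display.
function formatTime(totalSec) {
    var hours = Math.floor(totalSec / 3600) % 24;
    var minutes = Math.floor(totalSec / 60) % 60;
    var seconds = totalSec % 60;
    var pad = function(n) { return n < 10 ? '0' + n : '' + n; };
    return pad(hours) + ':' + pad(minutes) + ':' + pad(seconds);
}
```

With the default 90 second interval, formatTime(90) gives "00:01:30", which is what shows up on the refresh button.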

Lets Build a Tree (From Salesforce.com Data Categories)

Salesforce data categories. If you’ve had to code around them on the Salesforce.com platform, you are probably aware of the complexity, and how much of a pain they can be. If you haven’t worked with them much, you are fortunate 😛 They are essentially a way to provide categories for any sObject in Salesforce. They are most frequently used with knowledge articles. The Apex calls, describes and schema for them are unlike anything else in the Salesforce schema. Categories are their own objects and they can be nested to infinite depth. In short, they are complicated and take a while to really get your head around (I still don’t know if I really have). Thankfully I’ve done a bunch of hard work and discovery so that you don’t have to. For this particular project, we are going to build a nifty tree style selector that allows a user to select any data category for a given sObject type. You can then do whatever you want with that info. Yes, I know there are some built in visualforce components for handling data categories, but they aren’t super flexible and this is just a good learning experience. In the end, you’ll have an interactive tree that might look something like this.

Word of Warning: I had to live modify some of the code I posted below to remove sensitive information that existed in the source project. I haven’t used the EXACT code below, but something very, very close. So please let me know if something doesn’t quite work and I’ll try to fix up the code in the post here. The idea works, it’s solid, but there might be a rough syntax error or something.

Our application is going to consist of a visualforce page that displays the tree, a component that contains the reusable tree code, and a static resource that contains the javascript libraries, css file and images for the tree structure. Of course we will also have an apex class that will handle some of the heavy lifting of getting category data and returning it to our visualforce page. We’ll use javascript/apex remoting to communicate with that Apex class. First off, let’s grab the static resource and get that uploaded into your org. You can snag it here

https://www.box.com/s/04u0cd8xjtm0z84tbhid

Upload that, make it public, and call it jsTree. Next we’ll need our Apex class. It looks like this.

global class CaseSlaController
{
    //constructors for component and visualforce page extension
    public CaseSlaController() {}
    public CaseSlaController(ApexPages.StandardController controller) {}

    //gets category data and returns in JSON format for visualforce pages. Beware that since we end up double JSON encoding the return 
    //(once from the JSON.serialize, and another time because that's how data is returned when moved over apex remoting) you have to fix
    //the data on the client side. We have to double encode it because the built in JSON encoder breaks down when trying to serialize
    //the Schema.DescribeDataCategoryGroupStructureResult object, but the explicit call works.
    @remoteAction 
    global static string getCategoriesJson(string sObjectType)
    {
        return JSON.serialize(CaseSlaController.getCategories(sObjectType));
    }

    public static  list<Schema.DescribeDataCategoryGroupStructureResult> getCategories(string sObjectType)
    {

        //the describing of categories requires pairs of sObject type, and category name. This holds a list of those pairs.
        list<Schema.DataCategoryGroupSObjectTypePair> pairs = new list<Schema.DataCategoryGroupSObjectTypePair>();

        //list of objects to describe, for this app we only take 1 sObject type at a time, as passed into this function.
        list<string> objects = new list<string>();
        objects.add(sObjectType);

        //describe the categories for this object type (knowledgeArticleVersion)
        List<Schema.DescribeDataCategoryGroupResult> describeCategoryResult =  Schema.describeDataCategoryGroups(objects);

        //add the found categories to the list.
        for(Schema.DescribeDataCategoryGroupResult s : describeCategoryResult)
        {
            Schema.DataCategoryGroupSObjectTypePair thisPair = new Schema.DataCategoryGroupSObjectTypePair();
            thisPair.setSobject(sObjectType);
            thisPair.setDataCategoryGroupName(s.getName());
            pairs.add(thisPair);            
        }

        //describe the categories recursively
        list<Schema.DescribeDataCategoryGroupStructureResult> results = Schema.describeDataCategoryGroupStructures(pairs,false);

        return results;
    }    
    private static DataCategory[] getAllCategories(DataCategory [] categories)
    {
        if(categories.isEmpty())
        {
            return new DataCategory[]{};
        } 
        else
        {
            DataCategory [] categoriesClone = categories.clone();
            DataCategory category = categoriesClone[0];
            DataCategory[] allCategories = new DataCategory[]{category};
            categoriesClone.remove(0);
            categoriesClone.addAll(category.getChildCategories());
            allCategories.addAll(getAllCategories(categoriesClone));
            return allCategories;
        }
    }
}

So there are three functions there and two constructors. The constructors are for later on when we use this thing in a component and a visualforce page, so don’t really worry about them. Next is getCategoriesJson; that is the remote function we will call from our javascript to get the category data. It just invokes the getCategories function and serializes the result, since getCategories returns an object type that Salesforce can’t serialize with its automatic JSON serializer without blowing up (in my real app I had to use getCategories for another reason, hence why I didn’t just combine the two functions into one that always returns JSON). The last one is just a private function for spidering the data category description. Other than that, you can check out the comments to figure out a bit more about what it’s doing. In short, it describes the category groups for the given sObject type, creates dataCategoryGroupSobjectTypePairs from those groups, describes those, and returns the huge complicated chunk.
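To give a feel for what that spidering does, here’s a javascript sketch of the same recursion run against a mocked-up category tree. The name/label/childCategories fields mirror what the component below reads out of the describe result; a real result carries more properties, and the mock data itself is invented for illustration.

```javascript
// Mocked category tree, roughly the shape the describe result serializes to.
// Real results have more fields; name/label/childCategories are what we use.
var mockCategories = [
    { name: 'All', label: 'All', childCategories: [
        { name: 'Hardware', label: 'Hardware', childCategories: [
            { name: 'Laptops', label: 'Laptops', childCategories: [] }
        ]},
        { name: 'Software', label: 'Software', childCategories: [] }
    ]}
];

// Same spidering as the Apex getAllCategories function: take the first
// category, queue up its children, and recurse on the remainder.
function getAllCategories(categories) {
    if (categories.length === 0) return [];
    var clone = categories.slice();
    var category = clone.shift();
    var all = [category];
    clone = clone.concat(category.childCategories || []);
    return all.concat(getAllCategories(clone));
}
```

Running it on the mock data flattens the tree level by level: All, then Hardware and Software, then Laptops.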

Alright, so we’ve got the back end set up, let’s actually make it do something. For that we need our component and visualforce page. First up, the component. Wrapping this picker in a component makes it easy to use on lots of different visualforce pages. It’s not required, but it’s probably a better design practice.

<apex:component Controller="CaseSlaController">
    <!-- Two parameters can be passed into this component -->
    <apex:attribute name="sObjectType" type="string" description="the sObject type to get data category tree for" />
    <apex:attribute name="callback" type="string" description="Name of javascript function to call when tree drawing is complete" />

    <!-- include the required libraries -->
    <link rel="stylesheet" href="{!URLFOR($Resource.jsTree, 'css/jquery.treeview.css')}" />
    <apex:includeScript value="{!URLFOR($Resource.jsTree, 'js/jquery.min.js')}" />
    <apex:includeScript value="{!URLFOR($Resource.jsTree, 'js/jquery.treeview.js')}" />

    <script>
        //put jQuery in no conflict mode
        j$=jQuery.noConflict();     

        //object to hold all our functions and variables; keeps things organized and doesn't pollute the global namespace
        var categorySelect = new Object();

        //invokes the getCategoriesJson function on the apex controller. Returns to the callback function with the
        //fetched data
        categorySelect.getCategoryData = function(sObjectType,callback)
        {
            Visualforce.remoting.Manager.invokeAction(
                '{!$RemoteAction.CaseSlaController.getCategoriesJson}', 
                sObjectType,
                function(result, event){
                   callback(result,event);
                }, 
                {escape: true}
            );          
        }    

        //as soon as the dom has loaded lets get to work
        j$(document).ready(function() {

            //first off, find all the data category data for the given sObject type.       
            categorySelect.getCategoryData('{!sObjectType}',function(result,event)
            {
                //the json data we get back is all screwed up. Since it got JSON encoded twice, quotes become the html
                //entity &quot; and such. So we fix the JSON and reparse it. I know it's kind of hacky but I don't know of a better way

                var fixedJson = JSON.parse(categorySelect.htmlDecode(result));         

                //lets create the series of nested lists required for our tree plugin from the json data.
                var html = categorySelect.buildTreeHtml(fixedJson);                          

                //write the content into the dom
                j$('#categoryTree').html(html);              

                //apply the treeview plugin
                j$("#categoryTree").treeview({
                    persist: "location",
                    collapsed: true,
                    unique: true
                });  

                //if the string that was passed in for callback is actually representative of a function, then call it
                //and pass it the categoryTree html.
                if(typeof({!callback}) == "function")
                {
                    {!callback}(j$("#categoryTree"));                                               
                }
            });    
        });

        //function that is meant to be called recursivly to build tree structure html
        categorySelect.buildTreeHtml = function(category)
        {
            var html = '';     

            //iterate over the category data  
            j$.each(category,function(index,value)
            {
                //create list item for this item.
                html+='<li><a href="#" category="'+value.name+'" class="dataCategoryLink" title="Attach '+value.label+' SLA to Case">'+value.label+'</a>';

                //check to see if this item has any topCategories to iterate over. If so, pass them into this function again after creating a container               
                if(value.hasOwnProperty('topCategories') && value.topCategories.length > 0)
                {
                    html += '<ul>';
                    html += categorySelect.buildTreeHtml(value.topCategories);                    
                    html +='</ul>';                 
                }   
                //check to see if this item has any childCategories to iterate over. If so, pass them into this function again after creating a container                           
                else if(value.hasOwnProperty('childCategories')  && value.childCategories.length > 0)
                {
                    html+='<ul>';                   
                    html += categorySelect.buildTreeHtml(value.childCategories);
                    html+='</ul>';
                }
                html += '</li>';
            });
            return html;                
        }

        //fixes the double encoded JSON by replacing html entities with their actual symbol equivalents
        //ex: &quot; becomes "
        categorySelect.htmlDecode = function(value) 
        {
            if (value) 
            {
                return j$('<div />').html(value).text();
            } 
            else
            {
                return '';
            }
        }            
    </script>
    <div id="categoryTreeContainer">
        <ul id="categoryTree">

        </ul>
    </div>
</apex:component>

Now, finally, we need a visualforce page to invoke our component and rig up our tree items to actually do something when you click them. We wanted to keep the component simple, just building the interactive tree, because different pages might want it to do different things. That is where the callback function comes in handy. The visualforce page can invoke the component and specify a callback function to call once the component has finished its work, so we know we can start manipulating the tree. Our page might look like this.

<apex:page sidebar="false" standardController="Case" showHeader="false" extensions="CaseSlaController">
    <c:categorySelect callback="knowledgePicker.bindTreeClicks" sObjectType="KnowledgeArticleVersion"/>

    <script>           
        var knowledgePicker = new Object();

        knowledgePicker.bindTreeClicks = function(tree)
        {
            j$('.dataCategoryLink').click(function(event,ui){
                event.preventDefault();
                alert('clicked ' + j$(this).attr('category'));
            }); 
        }                           
    </script>   
</apex:page>

We invoke the component, passing it a callback function name and the type of sObject we want to make the category tree of. We then create a function with the same name as the callback. Inside that function we simply attach an onclick event handler to the tree category links that sends us an alert of which one the user clicked. Of course we could then do anything we wanted: make another remoting call, update an object, whatever.
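If you want to see what the component’s buildTreeHtml recursion spits out without wiring up the whole page, here’s a jQuery-free version run against a tiny mocked category (the title and class attributes from the real component are trimmed out to keep it short, and the input data is invented):

```javascript
// jQuery-free sketch of the component's buildTreeHtml recursion.
function buildTreeHtml(categories) {
    var html = '';
    categories.forEach(function(value) {
        html += '<li><a href="#" category="' + value.name + '">' + value.label + '</a>';
        // nest a <ul> for whichever child collection this level carries
        var children = (value.topCategories && value.topCategories.length > 0)
            ? value.topCategories
            : value.childCategories;
        if (children && children.length > 0) {
            html += '<ul>' + buildTreeHtml(children) + '</ul>';
        }
        html += '</li>';
    });
    return html;
}
```

Feeding it one parent with one child yields the nested li/ul structure the treeview plugin expects.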

Anyway, I hope this was helpful. I know I was a bit frustrated at the lack of sample code for dealing with categories, so hopefully this helps some other developers out there who might be trying to do the same kind of thing. Till next time!

-Kenji/Dan


One door closes, another one opens

Hey everyone,

As some of you may be aware, I have recently accepted a new position as senior developer at RedKite Technologies. They are a consulting firm specializing in implementation and custom development of Salesforce, mostly for financial organizations (but not exclusively). While I am extremely excited for this new opportunity to work with an awesome team and continue to grow my skills, it does mean that I will no longer be able to do freelance work (it could be taken as a conflict of interest, you understand). So as of now, I am sorry but I have to decline any offers for freelance work, at least until the smoke clears and some details are figured out.

The good news is that if you would like to leverage my skills and those of some other very talented developers working with me, you can! RedKite is happy to evaluate any Salesforce project, and if you ask you may be able to get me tasked on your project. RedKite has an excellent track record, is growing very rapidly, and you are sure to be happy with the results of any project you engage us on. I wouldn’t be working there if it weren’t made up of some of the most talented and passionate people in the industry. I am also still available to answer questions, give advice, etc. I just don’t think I can accept money or undertake entire projects on the side at this point. Thanks for understanding, and I hope we can still do business, if perhaps through a slightly more official channel 😛

-Dan/Kenji


Publicly Hosted Apex REST Class bug (maybe?)

I seem to have run across an odd bug. Custom Apex REST classes hosted via a Salesforce site will not work in a production org. They do work in sandbox and developer orgs, so I am fairly convinced the approach is valid and my config is correct. This is a sample class.

@RestResource(urlMapping='/testPublicRest')
global class testPublicRest {

    @HttpGet
    global static String doGet() {
        String name = RestContext.request.params.get('name');
        return 'Hello ' + name;
    }

    @isTest
    global static void testRespondentPortal()
    {
        //set up the request object
        System.RestContext.request = new RestRequest();
        System.RestContext.response = new RestResponse();

        RestContext.request.requestURI = '/testPublicRest';
        RestContext.request.params.put('name','test');

        //send the request
        testPublicRest.doGet();
    }
}

Sandbox version sans namespace – Works
https://fpitesters.testbed.cs7.force.com/webServices/services/apexrest/testPublicRest?name=dan

Developer version with namespace – Works
https://xerointeractive-developer-edition.na9.force.com/partyForce/services/apexrest/XeroInteractive/testPublicRest?name=dan

Production version sans namespace – Fails
https://fpitesters.secure.force.com/webServices/services/apexrest/testPublicRest?name=dan

It fails saying that it cannot find a resource with that name.

<Errors>
<Error>
<errorCode>NOT_FOUND</errorCode>
<message>Could not find a match for URL /testPublicRest</message>
</Error>
</Errors>

If you attempt to access it via the non secure domain you will get an HTTPS required message, so the resource is at least being located. It throws this error, which makes sense.

<Errors>
<Error>
<errorCode>UNSUPPORTED_CLIENT</errorCode>
<message>HTTPS Required</message>
</Error>
</Errors>

Seems like I found a bug, maybe? To test yourself, just copy and paste the above code and host it via a Salesforce site. Access it in your sandbox and it should work (remember to access it via https). To get to a REST service, just include /services/apexrest/yourService at the end of your site url. Then try deploying it to prod and doing the same. It will most likely fail.
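The endpoint URL follows the same pattern in every org, so for reference here’s a quick sketch that assembles it. The site base URL is a made-up example, and the optional namespace segment mirrors the developer-org URL above (only needed when the class lives in a managed package namespace).

```javascript
// Builds the public Apex REST endpoint URL for a Salesforce site.
// siteBase is the site's secure URL; namespace is optional.
function apexRestUrl(siteBase, serviceName, params, namespace) {
    var path = '/services/apexrest/' + (namespace ? namespace + '/' : '') + serviceName;
    var query = Object.keys(params).map(function(key) {
        return encodeURIComponent(key) + '=' + encodeURIComponent(params[key]);
    }).join('&');
    return siteBase + path + (query ? '?' + query : '');
}
```

For example, apexRestUrl('https://example.secure.force.com/webServices', 'testPublicRest', {name: 'dan'}) reproduces the production-style URL that fails above.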

I’d love to hear any feedback/ideas on this, as it’s a fairly critical part of a framework I am developing. Thanks!

Also if you do have any info, make sure to post it on the stack exchange. That’s probably the best place for this kind of thing.
http://salesforce.stackexchange.com/questions/6122/custom-rest-service-https-error

UPDATE: Got it figured out. It was due to a permissions error on the guest account the site was using. Somehow an object on the site’s guest user profile had an impossible permission setup (it had full read/write/modify all on a child object where it did not have read/write/modify all on the parent object (an opportunity)). So fixing the permissions and making sure the service had read/write to all objects and fields it required seems to have fixed this error. If you are getting this, make sure to check your object permissions and that everything the service needs is there, and that you don’t have some kind of weird setup issue like I did.