Oh my god. It's full of code!


Simplification

Hey all,

So I wanted to just throw this out there: I’ve moved from Minnesota to VERY rural Montana. I traded in my 3 bedroom rambler for a studio cabin on a ranch near the Canadian border. As such, my access to technology is somewhat reduced, and I don’t know if I’ll be posting as much interesting stuff on this blog for a while. Odds are I’ll have some cool Salesforce stuff from time to time since I am maintaining my employment remotely, but I won’t be doing as much at-home hacking. If you are curious how things are going, why this happened, or just like my writing style, I’ve started a new blog detailing my journey. You can check it out here:

Montana Dan Blog

Anyway, I’ll still post what I can, but I figured I should at least inform the community why I might not be around quite as much. Till next time.

-Kenji


Deep Clone (Round 2)

So a day or two ago I posted my first draft of a deep clone, which would allow easy cloning of an entire data hierarchy. It was a semi proof-of-concept thing with some limitations (it could only handle somewhat smaller data sets, and didn’t let you configure all-or-nothing inserts, or specify whether you wanted to copy standard objects as well as custom or not). I was doing some thinking and I remembered hearing about the Queueable interface, which allows for asynchronous processing and bigger governor limits. I started thinking about chaining queueable jobs together to allow for copying much larger data sets. Each invocation would get its own governor limits and could theoretically go on as long as it took, since you can chain jobs indefinitely. I had attempted to use Queueable to solve this before, but I made the mistake of trying to kick off multiple jobs per invocation (one for each related object type). This obviously didn’t work due to the limits imposed on queueable jobs (an executing queueable job can only enqueue one more job). Once I thought of a way to only need one invocation per call (basically just rolling all the records that need to get cloned into one object and iterating over it) I figured I might have a shot at making this work. I took what I had written before, added a few options, and I think I’ve done it: an asynchronous deep clone that operates in distinct batches, with all-or-nothing handling and cleanup in case of error. This is some hot-off-the-presses code, so there are likely some lingering bugs, but I was too excited not to share it. Feast your eyes!

public class deepClone implements Queueable {

    //global describe to hold object describe data for query building and relationship iteration
    public map<String, Schema.SObjectType> globalDescribeMap = Schema.getGlobalDescribe();
    
    //holds the data to be cloned. Keyed by object type. Contains cloneData which contains the object to clone, and some data needed for queries
    public map<string,cloneData> thisInvocationCloneMap = new map<string,cloneData>();
    
    //should the clone process be all or nothing?
    public boolean allOrNothing = false;
    
    //each iteration adds the records it creates to this property so in the event of an error we can roll it all back
    public list<id> allCreatedObjects = new list<id>();
    
    //only clone custom objects. Helps to avoid trying to clone system objects like chatter posts and such.
    public boolean onlyCloneCustomObjects = true;
    
    public static id clone(id sObjectId, boolean onlyCustomObjects, boolean allOrNothing)
    {
        
        deepClone startClone= new deepClone();
        startClone.onlyCloneCustomObjects  = onlyCustomObjects;
        startClone.allOrNothing = allOrNothing;
        
        sObject thisObject = sObjectId.getSobjectType().newSobject(sObjectId);
        cloneData thisClone = new cloneData(new list<sObject>{thisObject}, new map<id,id>());
        map<string,cloneData> cloneStartMap = new map<string,cloneData>();
        
        cloneStartMap.put(sObjectId.getSobjectType().getDescribe().getName(),thisClone);
        
        startClone.thisInvocationCloneMap = cloneStartMap;
        return System.enqueueJob(startClone);
    }
    
    public void execute(QueueableContext context) {
        deepCloneBatched();
    }
        
    /**
    * @description Clones an object and the entire related data hierarchy. Currently only clones custom objects, but enabling standard objects is easy. It is disabled because it increases the risk of hitting governor limits
    * @param sObject objectToClone the root object to be cloned. All descendant custom objects will be cloned as well
    * @return list<id> the ids of all the objects that were created during the clone.
    **/
    public list<id> deepCloneBatched()
    {
        map<string,cloneData> nextInvocationCloneMap = new map<string,cloneData>();
        
        //iterate over every object type in the public map
        for(string relatedObjectType : thisInvocationCloneMap.keySet())
        { 
            list<sobject> objectsToClone = thisInvocationCloneMap.get(relatedObjectType).objectsToClone;
            map<id,id> previousSourceToCloneMap = thisInvocationCloneMap.get(relatedObjectType).previousSourceToCloneMap;
            
            system.debug('\n\n\n-------------------- Cloning ' + objectsToClone.size() + ' records');
            list<id> objectIds = new list<id>();
            list<sobject> clones = new list<sobject>();
            map<id,id> sourceToCloneMap = new map<id,id>();
            list<database.saveresult> cloneInsertResult;
                       
            //if this function has been called recursively, then the previous batch of cloned records
            //have not been inserted yet, so now they must be before we can continue. Also, in that case
            //because these are already clones, we do not need to clone them again, so we can skip that part
            if(objectsToClone[0].Id == null)
            {
                //if they don't have an id that means these records are already clones. So just insert them with no need to clone beforehand.
                cloneInsertResult = database.insert(objectsToClone,allOrNothing);

                clones.addAll(objectsToClone);
                
                for(sObject thisClone : clones)
                {
                    sourceToCloneMap.put(thisClone.getCloneSourceId(),thisClone.Id);
                }
                            
                objectIds.addAll(new list<id>(previousSourceToCloneMap.keySet()));
                //get the ids of all these objects.                    
            }
            else
            {
                //get the ids of all these objects.
                for(sObject thisObj :objectsToClone)
                {
                    objectIds.add(thisObj.Id);
                }
    
                //create a select all query to get all the data for these objects since if we only got passed a basic sObject without data 
                //then the clone will be empty
                string objectDataQuery = buildSelectAllStatment(relatedObjectType);
                
                //add a where condition
                objectDataQuery += ' where id in :objectIds';
                
                //get the details of this object
                list<sObject> objectToCloneWithData = database.query(objectDataQuery);
    
                for(sObject thisObj : objectToCloneWithData)
                {              
                    sObject clonedObject = thisObj.clone(false,true,false,false);
                    clones.add(clonedObject);               
                }    
                
                //insert the clones
                cloneInsertResult = database.insert(clones,allOrNothing);
                
                for(sObject thisClone : clones)
                {
                    sourceToCloneMap.put(thisClone.getCloneSourceId(),thisClone.Id);
                }
            }        
            
            for(database.saveResult saveResult :  cloneInsertResult)
            {
                if(saveResult.success)
                {
                    allCreatedObjects.add(saveResult.getId());
                }
                else if(allOrNothing)
                {
                    cleanUpError();
                    return allCreatedObjects;
                }
            }
              
            //Describe this object type so we can deduce its child relationships
            Schema.DescribeSObjectResult objectDescribe = globalDescribeMap.get(relatedObjectType).getDescribe();
                        
            //get this objects child relationship types
            List<Schema.ChildRelationship> childRelationships = objectDescribe.getChildRelationships();
    
            system.debug('\n\n\n-------------------- ' + objectDescribe.getName() + ' has ' + childRelationships.size() + ' child relationships');
            
            //then we have to iterate over every child relationship type, and every record of that type, and clone them as well. 
            for(Schema.ChildRelationship thisRelationship : childRelationships)
            { 
                          
                Schema.DescribeSObjectResult childObjectDescribe = thisRelationship.getChildSObject().getDescribe();
                string relationshipField = thisRelationship.getField().getDescribe().getName();
                
                try
                {
                    system.debug('\n\n\n-------------------- Looking at ' + childObjectDescribe.getName() + ' which is a child object of ' + objectDescribe.getName());
                    
                    if(!childObjectDescribe.isCreateable() || !childObjectDescribe.isQueryable())
                    {
                        system.debug('-------------------- Object is not one of the following: queryable, creatable. Skipping attempting to clone this object');
                        continue;
                    }
                    if(onlyCloneCustomObjects && !childObjectDescribe.isCustom())
                    {
                        system.debug('-------------------- Object is not custom and custom object only clone is on. Skipping this object.');
                        continue;                   
                    }
                    if(Limits.getQueries() >= Limits.getLimitQueries())
                    {
                        system.debug('\n\n\n-------------------- Governor limits hit. Must abort.');
                        
                        //if we hit the query limit and this is an all or nothing job, we have to delete what we created and abort
                        if(allOrNothing)
                        {
                            cleanUpError();
                        }
                        return allCreatedObjects;
                    }
                    //create a select all query from the child object type
                    string childDataQuery = buildSelectAllStatment(childObjectDescribe.getName());
                    
                    //add a where condition that will only find records related to the records we just cloned. The relationship field is the lookup field on the child object
                    childDataQuery+= ' where '+relationshipField+ ' in :objectIds';
                    
                    //get the details of this object
                    list<sObject> childObjectsWithData = database.query(childDataQuery);
                    
                    system.debug('\n\n\n-------------------- Object queried. Found ' + childObjectsWithData.size() + ' records to clone');
                    
                    if(!childObjectsWithData.isEmpty())
                    {               
                        //scoped per relationship type so each type's clone data only contains its own records
                        list<sObject> newClones = new list<sObject>();
                        map<id,id> childRecordSourceToClone = new map<id,id>();
                        
                        for(sObject thisChildObject : childObjectsWithData)
                        {
                            childRecordSourceToClone.put(thisChildObject.Id,null);
                            
                            //clone the object
                            sObject newClone = thisChildObject.clone();
                            
                            //since the record we cloned still has the original parent id, we now need to update the clone with the id of its cloned parent.
                            //to do that we reference the map we created above and use it to get the new cloned parent.                        
                            system.debug('\n\n\n----------- Attempting to change parent of clone....');
                            id newParentId = sourceToCloneMap.get((id) thisChildObject.get(relationshipField));
                            
                            system.debug('Old Parent: ' + thisChildObject.get(relationshipField) + ' new parent ' + newParentId);
                            
                            //write the new parent value into the record
                            newClone.put(thisRelationship.getField().getDescribe().getName(),newParentId );
                            
                            //add this new clone to the list. It will be inserted once the deepClone function is called again. I know it's a little odd to not just insert them now,
                            //but it saves on redundant logic in the long run.
                            newClones.add(newClone);             
                        }  
                        cloneData thisCloneData = new cloneData(newClones,childRecordSourceToClone);
                        nextInvocationCloneMap.put(childObjectDescribe.getName(),thisCloneData);                             
                    }                                       
                       
                }
                catch(exception e)
                {
                    system.debug('\n\n\n---------------------- Error attempting to clone child records of type: ' + childObjectDescribe.getName());
                    system.debug(e); 
                }            
            }          
        }
        
        system.debug('\n\n\n-------------------- Done iterating cloneable objects.');
        
        system.debug('\n\n\n-------------------- Clone Map below');
        system.debug(nextInvocationCloneMap);
        
        system.debug('\n\n\n-------------------- All created object ids thus far across this invocation');
        system.debug(allCreatedObjects);
        
        //if our map is not empty that means we have more records to clone. So queue up the next job.
        if(!nextInvocationCloneMap.isEmpty())
        {
            system.debug('\n\n\n-------------------- Clone map is not empty. Sending objects to be cloned to another job');
            
            deepClone nextIteration = new deepClone();
            nextIteration.thisInvocationCloneMap = nextInvocationCloneMap;
            nextIteration.allCreatedObjects = allCreatedObjects;
            nextIteration.onlyCloneCustomObjects  = onlyCloneCustomObjects;
            nextIteration.allOrNothing = allOrNothing;
            id  jobId = System.enqueueJob(nextIteration);       
            
            system.debug('\n\n\n-------------------- Next queueable job scheduled. Id is: ' + jobId);  
        }
        
        system.debug('\n\n\n-------------------- Cloning Done!');
        
        return allCreatedObjects;
    }
     
    /**
    * @description create a string which is a select statement for the given object type that will select all fields. Equivalent to SELECT * FROM objectName in SQL
    * @param objectName the API name of the object to build a query string for
    * @return string a string containing the SELECT keyword, all the fields on the specified object, and the FROM clause specifying that object type. You may add your own where statements after.
    **/
    public string buildSelectAllStatment(string objectName){ return buildSelectAllStatment(objectName, new list<string>());}
    public string buildSelectAllStatment(string objectName, list<string> extraFields)
    {       
        // Initialize setup variables
        String query = 'SELECT ';
        String objectFields = String.Join(new list<string>(globalDescribeMap.get(objectName).getDescribe().fields.getMap().keySet()),',');
        if(extraFields != null)
        {
            objectFields += ','+String.Join(extraFields,',');
        }
        
        objectFields = objectFields.removeEnd(',');
        
        query += objectFields;
    
        // Add FROM statement
        query += ' FROM ' + objectName;
                 
        return query;   
    }    
    
    public void cleanUpError()
    {
        database.delete(allCreatedObjects);
    }
    
    public class cloneData
    {
        public list<sObject> objectsToClone = new list<sObject>();        
        public map<id,id> previousSourceToCloneMap = new map<id,id>();  
        
        public cloneData(list<sObject> objects, map<id,id> previousDataMap)
        {
            this.objectsToClone = objects;
            this.previousSourceToCloneMap = previousDataMap;
        }   
    }    
}    

It’ll clone your record, your record’s children, your record’s children’s children, and yes, even your record’s children’s children’s children (you get the point)! Simply invoke the deepClone.clone() method with the id of the record to start the clone process at, whether you want to only copy custom objects, and whether you want all-or-nothing processing. Deep Clone takes care of the rest, automatically figuring out relationships, cloning, re-parenting, and generally being awesome. As always I’m happy to get feedback or suggestions! Enjoy!
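For example, a minimal invocation looks something like this (the record id here is just a placeholder, swap in a real one):

//kick off an asynchronous deep clone of this record and all its custom object children,
//rolling everything back if any insert fails
id rootRecordId = 'a0B4B0000012345'; //placeholder id, use the id of the record you want to copy
id jobId = deepClone.clone(rootRecordId, true, true);
system.debug('Deep clone job enqueued: ' + jobId);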

-Kenji


Amazon Alexa is going to run/ruin my life

It was my birthday recently; I just turned 28. As a gift to myself I finally decided to order an Amazon Alexa, because I’ve wanted one since I heard about it a few months ago. If you aren’t familiar, it’s basically like a ‘Siri’ or ‘Cortana’ thing in a standalone personal assistant device that lives in your home. It’s always on and responds to voice commands from surprisingly far away. It can tell you the weather, check your calendar, manage your shopping list, and all that kind of nifty stuff. However, it can do more, much more. Thanks to the ability to develop custom ‘skills’ (their name for apps) and out-of-the-box If This Then That (IFTTT) integration, you can quickly start making Alexa do just about anything. I’ve owned it only a day now and I’ve already taught it two new tricks.

Also, if you aren’t familiar with IFTTT, it’s an online service that basically allows you to create simple rules that perform actions (hence the name: if this, then that). It can integrate all kinds of different services, so you no longer have to be an advanced programmer to automate much of your life. It’s a cool free service and I’d highly recommend checking it out.

You may remember a while back I did that whole write-up about making automatic door locking software to lock and unlock my front door. I figured a good way to jump into making custom commands would be to see if I could teach Alexa to do it for me upon request. Turns out it was surprisingly easy. Since I already had the web service up and running to respond to HTTP POST requests, I simply needed to create an IFTTT rule to send a request when Alexa heard a specific phrase. You may recall that I had some problems with IFTTT not seeming to work for me before, but it seems to now; might have been an error on my part for all I know. Here is the rule as it stands currently.

[Screenshots: IFTTT door rule, parts 1 and 2]

Every command issued to Alexa starts with the ‘wake word’, in this case I’ve chosen Alexa (since you can only pick between Alexa, Echo, and Amazon). Second is the command to issue so it knows what service to route the request to. For this the command is ‘trigger’, so Alexa knows to send the request to IFTTT. Then you simply include the phrase to match, and what to do. I decided to make the phrase ‘lock the door’, which when heard sends a POST request with the given JSON payload to my web server, which is listening for it. Boom, done.
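The payload itself is about as simple as it gets; it just names the action for my door service to perform, something like this (the exact field name is whatever your own service expects, so treat this as illustrative):

{
    "action": "lock"
}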

The next thing I wanted to do, and this is still just a very rough outline of a final idea, is Chromecast integration. Ideally I’d like to be able to say ‘Alexa trigger play netflix [moviename]’, but as of right now triggers created from IFTTT for Alexa can’t really contain variables aside from the whole command itself. So I could do ‘Alexa trigger netflix bojack horseman’ and create a specific request just for that show, but there is no way to create a generic template kind of request and pass the details on to the web service that is listening. That aside, what I do have currently is a start.

I found a command line tool that can interact with the Chromecast (check this guide for Command Line Chromecast), and then created an exec statement to call it from my web service. My door lock and unlock service already has logic for handling different commands, so I just created a new one called ‘play’ that plays my test video.

else if(action == 'play')
{
	console.log('Casting Requested Thing!');
	var exec = require('child_process').exec;
	var cmd = 'castnow c:\\cast\\testVideo.mp4 --device "Upstairs Living Room"';

	//fire and forget; we don't get to interact with castnow once it launches, but at least log any errors
	exec(cmd, function(error, stdout, stderr) {
		if (error) console.log('castnow error: ' + error);
	});
}

So that turned out to be pretty easy. Small caveat being that castnow is really meant to be an application you keep open and interact with to control the video. Since it is being invoked via a web service call, nothing really gets to ‘interact’ with it. I suppose you might be able to do some crazy shit like keeping a web socket open and continuing to pass commands to it, but that’s for another day.

The IFTTT rule is basically the same as the door lock one. Just change the phrase that triggers it, and change the JSON payload to have the action as “play” instead of “lock” or “unlock”, and the command gets triggered. I also created a corollary rule and bit of code for stopping the casting of the current video by playing another, empty video file (since there isn’t an explicit stop command in the castnow software).

There you have it, with Alexa, IFTTT, and a home web server you can start to do some pretty cool customized automation stuff. I think next up is getting it to order my favorite local pizza for me 😀


URL Encode Object/Simple Object Reflection in Apex

Hey all,

Kind of a quick yet cool post for you today. Have you ever wanted to be able to iterate over the properties of a custom class/object? Maybe you wanted to read out all the values, or for some other reason (such as serializing the object, perhaps) wanted to be able to figure out what properties an object contained but couldn’t find a way? We all know Apex has come a long way, but it is still lacking a few core features, reflection being one of them. Recently I had a requirement where I wanted to be able to take an object and serialize it into URL format. I didn’t want to have to manually type out every property of the object since it could change, and I’m lazy like that. Without reflection this seems impossible, but it’s not!

Remembering that Apex’s JSON deserialize methods are capable of creating an iterable version of an object by casting it into a list or a map, suddenly this becomes much more viable. Check it out.

    public static string urlEncodeObject(object objectToEncode)
    {
        //start as an empty string; if left uninitialized the += below would prepend the word 'null'
        string urlEncodedString = '';
        String serializedObject = JSON.serialize(objectToEncode);
        
        //deserializing into an untyped map lets us iterate over the object's properties
        Map<String,Object> deserializedObject = (Map<String,Object>) JSON.deserializeUntyped(serializedObject);
        
        for(String key : deserializedObject.keySet())
        {
            urlEncodedString += key+'='+string.valueOf(deserializedObject.get(key))+'&';
        }
        //trim the trailing & then encode the whole thing
        urlEncodedString = urlEncodedString.removeEnd('&');
        urlEncodedString = EncodingUtil.urlEncode(urlEncodedString,'utf-8');
        return urlEncodedString;
    }       

There you have it. By simply serializing an object, then deserializing it, we can now iterate over it. Pretty slick, eh? Not perfect, I know, and it doesn’t work awesome for complex objects, but it’s better than nothing until Apex introduces some real reflection abilities.
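Quick hypothetical usage example (demoParams is just a stand-in class to encode; call the method with its class name prefix if it lives elsewhere):

    public class demoParams
    {
        public string name = 'Kenji';
        public integer age = 28;
    }

    //prints the encoded pairs, something like name%3DKenji%26age%3D28
    //(key order depends on the deserialized map, and the = and & come back encoded)
    system.debug(urlEncodeObject(new demoParams()));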


Using google forms and sheets as a data source for graphs

Hey all,

Long time no post! I’ve been on vacation and in general just being kind of lazy, but today I’ve got a simple fun project for us. You see, my girlfriend is always right, well, almost always. Very rarely I’ll remember something correctly, but in general she’s always correct (and not in the ‘haha men are so dumb, women know everything’ way; legit she actually remembers way more stuff than me). This phenomenon has gotten so pervasive that, just for kicks, I wanted to create a live chart running in the house displaying how often either of us was right about stuff (I know I’ll regret this eventually). So for my mini project I had a few goals:

1) Have a live chart that updates automatically on a TV in my house (we have an extra TV that we generally just use as a media center/music streaming box via a Chromecast)

2) Make an easy interface to add new data to the chart

3) Make the chart slick looking

4) Keep it simple. This is basically a hobby project so I don’t want to go too nuts.

Before we get started, you can see the demo here:
http://xerointeractive-developer-edition.na9.force.com/partyForce/RightChart

Please close it when you are done though; my dev org only gets so many HTTP requests per day (note to self: add some kind of global request caching or something).

I was able to complete this project in about an hour and a half and meet all my goals. So now I’ll show you how.

Right off the bat I had a general idea of how I would do this (though the approach did morph a bit). From a previous project I knew it was possible to store and retrieve data in a Google spreadsheet. You can get the raw CSV data by using a special URL, and then import it via an HTTP request from an Apex controller. I figured this was easier than setting up a Salesforce object and creating a custom interface for adding data, and hell, it’s cool to be able to utilize Google Forms data for something.


My basic form for collecting data

From there it’s just a matter of passing the data to a chart system and making it poll the sheet occasionally. So anyway, first off we are going to need a google form to collect our data. Head to google docs and create a new spreadsheet, then use the forms menu to create a new form for your page. In my case it’s just a simple single question multiple choice (with an ‘other’ option). Each time the form is submitted it puts the name and a timestamp into a sheet called ‘Form Responses 1’. This data format works pretty well. I played around with trying to create another sheet that used queryIf to sum all the times various names appeared in the sheet, but that approach had the limiting factor of only working for names I pre-coded it for. It wasn’t dynamic enough. So I decided to just let google collect the data, and I’d handle the summing and formatting in my code.


Your form should be gathering data in a way that looks something like this
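In raw CSV form the sheet is just two columns, a timestamp and the submitted name; everything below is made up for illustration (your header row will be your form’s actual question text):

Timestamp,Name
6/1/2015 18:02:11,Kelsey
6/1/2015 18:45:37,Dan
6/2/2015 9:12:04,Kelsey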

To actually get the data in a usable form for programming, we need a raw CSV version of it. Thankfully Google will provide this for you (though they aren’t exactly forthcoming with it). As of this writing, to get the raw CSV of your sheet, go to file and hit publish. Just publish the one sheet. You should be given a shareable URL with a long unique-looking id string. Take that and put it into this URL format

https://docs.google.com/spreadsheets/d/key/export?format=csv&id=key

Just replace the word key with your document’s unique ID. You should be able to put that URL in your browser and it should automatically attempt to download your spreadsheet in CSV format. If so, you are in good shape. If not, make sure you published it, and it’s shared, and all that good stuff. Once you have that working we can move to the next step.


Publish your form results sheet and make note of that unique ID, you’ll need it!

So now that the data exists and is accessible, we need to GET it. Because it’s the easiest publishing platform I know, I decided to just use Salesforce Sites, which means Apex is going to be my back end. So I’ll need an Apex call to fetch the CSV data from the Google sheet, and some code to parse that CSV into some kind of logical structure. Again, thankfully from past projects, I had just such a class.

//gets CSV data from a given URL and parses it into a list of lists
global class RightChartController 
{

    public String getDataSourceUrl() {
        return 'Your google document url here';
    }

   

    //gets CSV data from a given source
    @remoteAction
    global static  List<List<String>> importCSV(string url)
    {
         List<List<String>> result = new List<List<String>>(); 
        try
        {
            string responseBody;
            
            //create http request to get import data from
            HttpRequest req = new HttpRequest();
            req.setEndpoint(url);
            req.setMethod('GET');         
            Http http = new Http();
            
            //if this is not a test actually send the http request. if it is a test, hard code the returned results.
            if(!Test.isRunningTest())
            {
                HTTPResponse res = http.send(req);
                responseBody = res.getBody();
            }
            else
            {
                responseBody = 'Name,Count\ntammy,10\njoe,5\nFrank,0';
            }
            
            //the data should come back in CSV format, so hand it off to the parsing function which will make a list of lists of strings (each list is one row, each item within that sub list is one column)
            result = RightChartController.parseCSV (responseBody,true);
        }
        catch(exception e)
        {
            system.debug('\n\n\n\n----------------------------- Error importing chart data. ' + e.getMessage() + ' on line ' + e.getLineNumber());
        }
        return result;
    }
    
    //parses a csv file. Returns a list of lists. Each main list is a row, and the contained list is all the columns.
    public static List<List<String>> parseCSV(String contents,Boolean skipHeaders)
    {
        List<List<String>> allFields = new List<List<String>>();
    
        // replace instances where a double quote begins a field containing a comma
        // in this case you get a double quote followed by a doubled double quote
        // do this for beginning and end of a field
        contents = contents.replaceAll(',"""',',"DBLQT').replaceall('""",','DBLQT",');
        // now replace all remaining double quotes - we do this so that we can reconstruct
        // fields with commas inside assuming they begin and end with a double quote
        contents = contents.replaceAll('""','DBLQT');
        // we are not attempting to handle fields with a newline inside of them
        // so, split on newline to get the spreadsheet rows
        List<String> lines = new List<String>();
        try {
            lines = contents.split('\n');
        } catch (System.ListException e) {
            System.debug('Limits exceeded?' + e.getMessage());
        }
        Integer num = 0;
        for(String line : lines) {
            // check for blank CSV lines (only commas)
            if (line.replaceAll(',','').trim().length() == 0) break;
            
            List<String> fields = line.split(',');  
            List<String> cleanFields = new List<String>();
            String compositeField;
            Boolean makeCompositeField = false;
            for(String field : fields) {
                if (field.startsWith('"') && field.endsWith('"')) {
                    cleanFields.add(field.replaceAll('DBLQT','"'));
                } else if (field.startsWith('"')) {
                    makeCompositeField = true;
                    compositeField = field;
                } else if (field.endsWith('"')) {
                    compositeField += ',' + field;
                    cleanFields.add(compositeField.replaceAll('DBLQT','"'));
                    makeCompositeField = false;
                } else if (makeCompositeField) {
                    compositeField +=  ',' + field;
                } else {
                    cleanFields.add(field.replaceAll('DBLQT','"'));
                }
            }
            
            allFields.add(cleanFields);
        }
        if (skipHeaders) allFields.remove(0);
        return allFields;       
    }
}

So now we’ve got the back end code required to both get the data and parse it (don’t forget to add a remote site exception in your Salesforce security controls for docs.google.com!). Now we just need an interface to use that data and display it in a nifty chart. Using Highcharts this is pretty easy. Mine ended up looking something like this (you don’t have to tell me the code is kind of sloppy; this was just a quick throw-together project).

<apex:page controller="RightChartController" sidebar="false" showHeader="false" standardStylesheets="false">
    <script src="//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
    <script src="https://code.highcharts.com/highcharts.js"></script>
    <script src="https://code.highcharts.com/highcharts-3d.js"></script>
    <script>
        //load the document source locally in case we want to let the user change it or something later
        var docSource = '{!dataSourceUrl}';
        var chart;
        
        //fetches the data from the google sheet
        function getData(docSource,callback)
        {
           Visualforce.remoting.Manager.invokeAction(
                '{!$RemoteAction.RightChartController.importCSV}', 
                docSource,
                function(result, event){
                    if (event.status) {
                        callback(result);
                    }
                }, 
                {escape: true}
            );   
     
        }
        
        //massages the data from being an array of arrays (one line per form entry) into an array of objects with totals
        //should probably be refactored to make it more efficient, but whatever.
        function translateDataToHighChartFormat(csvData)
        {
            var chartData = new Array();
            var totals = new Object();
            
            for(var i = 0; i < csvData.length; i++)
            {
                var timestamp = csvData[i][0];
                var name = csvData[i][1];
                 
                if(totals.hasOwnProperty(name))
                {
                    totals[name]++;
                }
                else
                {
                    totals[name] = 1;
                }
            }
            
            for(key in totals)
            {
                var thisPoint = new Object();
                thisPoint.name = key;
                thisPoint.y = totals[key];
                chartData.push(thisPoint);
            }
            
            return chartData;
        }
        
        //create the chart on document load
        $(function () 
        {
            chart = new Highcharts.Chart({
                chart: {
                    type: 'pie',
                    options3d: {
                        enabled: true,
                        alpha: 45,
                        beta: 0
                    },
                    renderTo: 'container'
                },
                title: {
                    text: 'Told You So'
                },                
                plotOptions: {
                    pie: {
                        depth: 25
                    }
                },
                series: [{
                    data: []
                }]
            });
            
            //set interval timer to poll the document every 10 seconds
            setInterval(function(){
                getData(docSource,function(result){
                    chart.series[0].setData(translateDataToHighChartFormat(result));
                    
                });
            },10000);
            
            //get the data once initially so we don't have to wait for the first interval to fire
            getData(docSource,function(result){
                chart.series[0].setData(translateDataToHighChartFormat(result));
                $('#Loading').hide();
            });
        });    
    </script>
    <div id="container" style="height: 400px"></div>
    <div id="Loading" style="text-align:center; font-weight:bold; font-size: 24px">Loading Chart Data Please Wait</div>
</apex:page>

If everything has gone smoothly, you should end up with something that looks like this

[Screenshot: the live 'Told You So' pie chart]

With our page alive, it’s a simple matter to add it to a Salesforce Site. Anyone can view it, and anyone you give the form link to will be able to add data to it. As data is added the chart will automatically redraw itself every 10 seconds with the new data set. Then it was just a simple matter of having the chart open on some computer and using the chrometab app for Chrome to send it to my Chromecast. Now we can be reminded of how stupid I am all the time... what have I done?


Stripping Nulls from a JSON object in Apex

NOTE: If you don’t want to read the wall of text/synopsis/description, just scroll to the bottom. The function you need is there.

I feel dirty. This is the grossest hack I have had to write in a while, but it is also too useful not to share (I think). Salesforce did us an awesome favor by introducing the JSON.serialize utility; it can take any object and serialize it into JSON, which is great! The only problem is that you have no control over the output JSON; the method takes no params except for the source object. Normally this wouldn’t be a big deal, I mean there isn’t a lot to customize about JSON usually, it just is what it is. There is however one case where you may want to control the output, and that is in the case of nulls. You see, most of the time when you are sending JSON to a remote service, if you have a param specified as null it will just skip over it, as it should. Some of the stupider APIs try and process that null as if it were a value. This is especially annoying when the API has optional parameters and you are using a language like Apex, which being strongly typed makes it very difficult to modify an object at run time to remove a property. For example, say I am ordering a pizza via some kind of awesome pizza ordering API. The API might take a size, some toppings, and a desired delivery time (for future deliveries). Their API documentation states that delivery time is an optional param, and if not specified the pizza will be delivered as soon as possible, which is nice. So I write my little class in Apex:

    public class pizzaOrder
    {
    	public string size;
    	public list<string> toppings;
    	public datetime prefferedDeliveryTime;
    
    }
    
    public static string orderPizza(string size, list<string> toppings, datetime prefferedDeliveryTime)
    {
    	pizzaOrder thisOrder = new pizzaOrder();
    	thisOrder.size = size;
    	thisOrder.toppings = toppings;
    	thisOrder.prefferedDeliveryTime	= prefferedDeliveryTime;
    	
    	string jsonOrderString = JSON.serialize(thisOrder);
    	
    	return jsonOrderString;
    }
    
    list<string> toppings = new list<string>();
    toppings.add('cheese');
    toppings.add('black olives');
    toppings.add('jalepenos');
                     
    orderPizza('large', toppings, null);

And your resulting JSON looks like

{"toppings":["cheese","black olives","jalepenos"],"size":"large","prefferedDeliveryTime":null}

Which would work beautifully, unless the pizza API is set up to treat any key present in the JSON object as an actual value, which in this case would be null. The API would freak out saying that null isn’t a valid datetime, and you are yelling at the screen trying to figure out why the stupid API can’t figure out that if an optional param has a null value, it should just skip it instead of trying to evaluate it.

Now in this little example you could easily work around the issue by just specifying the prefferedDeliveryTime as the current datetime if the user didn’t pass one in. Not a big deal. However, what if there is no valid default value to use? In my recent problem there is an optional account number I can pass in to the API. If I pass it in, it uses that. If I don’t, it uses the account number set up in the system. So while I want to support the ability to pass in an account number, if the user doesn’t enter one my app will blow up, because when the API encounters a null value for that optional param it explodes. I can’t not have a property for the account number because I might need it, but including it as a null (the user just wants to use the default, which Salesforce has no way of knowing) makes the API fail. Ok, whew, so now hopefully we all understand the problem. Now what the hell do we do about it?

While trying to solve this, I explored a few different options. At first I thought of deserializing the JSON object back into a generic object (map<string,object>), checking for nulls in any of the key/value pairs, removing them, then serializing the result. This failed due to difficulties with detecting the type of each value (tons of ‘unable to convert list<any> to map<string,object>’ errors that I wasn’t able to resolve). Of course you also have the recursion issue, since you’ll need to look at every element in the entire object, which could be infinitely deep/complex, so that adds another layer of complexity. Not impossible, but probably not super efficient, and I couldn’t even get it to work. Best of luck if anyone else tries.

The next solution I investigated was trying to write my own custom JSON generator that would just not put nulls in the object in the first place. This too quickly fell apart, because I needed a generic function that could take a string or an object (not both, just a generic thing of some kind) and turn it into JSON, since this function would have to be used to strip nulls from about 15 different API calls. I didn’t look super hard at this because all the code I saw looked really messy and I just didn’t like it.

The solution I finally decided to go with, while gross, dirty, hackish, and probably having earned me a spot in programmer hell, is also simple and efficient. Once I remembered that JSON is just a string, and can be manipulated as such, I started thinking about maybe using regex (yes, I am aware that when you solve one problem with regex, now you have two) to just strip out the nulls. Of course then you have to worry about cleaning up the syntax (extra commas, commas against braces, etc.) when you just rip elements out of the JSON string, but I think I’ve got a little function here that will do the job, at least until Salesforce offers a ‘don’t serialize nulls’ option in their JSON serializer.

    public static string stripJsonNulls(string JsonString)
    {
        if(JsonString != null)
        {
            JsonString = JsonString.replaceAll('\"[^\"]*\":null',''); //basic removal of null values
            JsonString = JsonString.replaceAll(',{2,}', ','); //remove duplicate/multiple commas
            JsonString = JsonString.replace('{,', '{'); //prevent opening brace from having a comma after it
            JsonString = JsonString.replace(',}', '}'); //prevent closing brace from having a comma before it
            JsonString = JsonString.replace('[,', '['); //prevent opening bracket from having a comma after it
            JsonString = JsonString.replace(',]', ']'); //prevent closing bracket from having a comma before it
        }
        
        return JsonString;
    }

Which after running on our previously generated JSON we get

{"toppings":["cheese","black olives","jalepenos"],"size":"large"}

Notice: no null prefferedDeliveryTime key. It’s not null, it’s just nonexistent. So there you have it, six lines of find and replace to remove nulls from your JSON object. Yes, you could combine them and probably make it a tad more efficient; I went for readability here. So sue me. Anyway, hope this helps someone out there, and if you end up using this, I’m sure I’ll see you in programmer hell at some point. Also, if anyone can make my initial idea of recursively spidering the JSON object and rebuilding it as a map of <string,object> without the nulls work, I’d be most impressed.
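If anyone wants a head start on that, here is a rough, untested sketch of how the recursion might look; instanceof checks against the untyped collections are the trick that gave me trouble:

    //rough sketch: recursively rebuild the untyped structure, dropping null values as we go
    public static Object stripNulls(Object node)
    {
        if(node instanceof Map<String,Object>)
        {
            Map<String,Object> cleanMap = new Map<String,Object>();
            for(String key : ((Map<String,Object>) node).keySet())
            {
                Object value = ((Map<String,Object>) node).get(key);
                if(value != null) cleanMap.put(key, stripNulls(value));
            }
            return cleanMap;
        }
        else if(node instanceof List<Object>)
        {
            List<Object> cleanList = new List<Object>();
            for(Object item : (List<Object>) node)
            {
                if(item != null) cleanList.add(stripNulls(item));
            }
            return cleanList;
        }
        return node;
    }

    //usage: String cleaned = JSON.serialize(stripNulls(JSON.deserializeUntyped(jsonOrderString)));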


Super Handy Mass Deploy Tool

So I know it has been a while. I’m not dead, I promise, just busy. Busy with trying to keep about a thousand orgs in sync: pushing code changes, layout changes, all kinds of junk from one source org to a ton of other orgs. I know you are saying ‘just use managed packages, or change sets’. Managed packages can be risky early in the dev process because you usually can’t remove components, and you get locked into a bit of a structure that you might not quite be settled on. Change sets are great, but many of these orgs are not linked; they are completely disparate, for different clients. Over the course of the last month or two it’s become apparent that just shuffling data around in Eclipse wasn’t going to do it anymore. I was going to have to break into using ANT and the Salesforce migration tool.

For those unaware, ANT is a Java-based build tool that the Salesforce migration tool plugs into; together they allow you to script deployments, which can be pretty useful. Normally though, actually setting up a deployment with ANT is a huge pain in the butt because you have to modify XML files, set up build files and stuff; in general it’s kind of slow to do. However, if you could write a script to write the files needed by the deployment script, now that would be handy. That is where this tool I wrote comes in. Now don’t get me wrong, it’s nothing fancy. It just helps make generating deployments a little easier. It allows you to specify a list of orgs, and their credentials, that you want to deploy to. In the deploy folder you place the package.xml file that contains the definitions of what you want to deploy, and the metadata itself (classes, triggers, objects, etc). Then when you run the program it will log into each org one by one, back it up, then deploy your package contents. It’s a nice set-it-and-forget-it way of deploying to numerous orgs in one go.

So here is what we are going to do. First of all, you are going to need to make sure you have a Java Runtime Environment (JRE) and the Java Development Kit (JDK) installed. Make sure to set your JAVA_HOME environment variable path to wherever the JDK library is installed (for me it was C:\Program Files\Java\jdk1.8.0_05). Then grab ANT and follow its guide for installation. Then grab the Force.com migration tool and get that installed in your ANT setup. Last, grab my SF Deploy Tool from bitbucket (https://Daniel_Llewellyn@bitbucket.org/Daniel_Llewellyn/sf-deploy-tool.git)

Now we have all the tools we need to deploy some components, but we don’t have anything to deploy, and we haven’t set up who we are going to deploy it to. So let’s use Eclipse to grab our deployable contents and generate our package.xml file (which contains the list of stuff to deploy; see the sample below). Fire up Eclipse and create a new project. For the project contents, select whatever you want to deploy to your target orgs. This is where using a package is useful, because it simplifies this process. Let the IDE download all the files for your project, then navigate to the project contents folder on your computer. Copy everything inside the src folder, including the package.xml file, and paste it into the deploy folder of my SF deploy tool. This is the payload that will be pushed to your orgs.
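If you haven’t seen one before, a minimal package.xml looks something like this (this sample just grabs every Apex class; the one Eclipse generates will list your actual components):

<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <!-- * is a wildcard meaning every member of this metadata type -->
        <members>*</members>
        <name>ApexClass</name>
    </types>
    <version>34.0</version>
</Package>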

The last step in our setup is to tell the deploy tool which orgs to push this content into. Open the orgs.txt file in the SF Deployer folder and enter the required information, one org per line. Each org requires a username, password, token, url and name attribute, separated by semicolons, with an equals sign used to denote each key/value. E.g.

username=xxxx;password=xxxxx;token=xxxxxxxxx;url=https://login.salesforce.com;name=TEST ORG

Now with all your credentials saved, you can run the SalesforceMultiDeploy.exe utility. It will iterate over each org one by one, back the org up, then deploy your changes. The console window will keep you informed of its progress as it goes and let you know when it’s all done. Of course this process is still subject to all the normal deploy problems you can encounter, but if everything in the target orgs is prepared to accept your deployment package, this can make life much easier. You could, for example, write another small script that copies the content from your source org at the end of each week, slaps it into the deploy folder, then invokes the deployment script to create an automated process that keeps your orgs in sync.

Also, I just threw this tool together quickly and would love some feedback. So either fork it and change it, or just give me ideas and I’ll do my best to implement them (one thing I really want to do is make this multi-threaded so that it can do deployments in parallel instead of serially, which would be a huge bonus for deployment speeds). Anyway, as always, I hope this is useful, and I’ll catch ya next time.

-Kenji