Oh my god. It's full of code!

Posts tagged “apex”

Dynamic Apex Invocation/Callbacks

So I’ve been working on that DeepClone class, and it occurred to me that whatever invokes that class might like to know when the process is done (so maybe it can do something with those created records). Seeing as DeepClone is by its very nature asynchronous, that presents a problem, since the caller cannot sit and wait for the process to complete. You know what other language has to deal with async issues a lot? Javascript. In Javascript we often solve this problem with a ‘callback’ function (I know callbacks are old and busted and promises are the new hotness, but bear with me here), wherein you call your asynchronous function and tell it what to call when it’s done. Most often that is done by passing in the actual function instead of just its name, but both are viable. Here is an example of what both might look like.

var someData = 'data to give to async function';

//first type of invocation passes in an actual function as the callback. 
asyncThing(someData,function(result){
	console.log('I passed in a function directly!' + result);
});

//second type of invocation passes in the name of a function to call instead
asyncThing(someData,'onCompleteHandler');

function onCompleteHandler(result)
{
	console.log('I passed in the name of a function to call and that happened' + result);
}

function asyncThing(data,callback)
{
	//async code here, maybe a callout or something that uses the data passed in.
	var result = 'probably a status code or the fetched data would go here';
	
	//if our callback is a function, then just straight up invoke it
	if(typeof callback === 'function')
	{
		callback(result);
	}
	//if our callback is a string, then dynamically invoke the function with that name
	else if(typeof callback === 'string')
	{
		window[callback](result);
	}
}

So yeah, Javascript is cool, it has callbacks. What does this have to do with Apex? Apex is strongly typed; you can’t just pass functions around as arguments, and you sure as hell can’t do dynamic invocation… or can you? Behold: by abusing the Tooling API, I give you a basic implementation of a dynamic Apex callback!

public HttpResponse invokeCallback(string callback, string dataString)
{
	HttpResponse res = new HttpResponse();
	try
	{
		//build the anonymous Apex statement to run, e.g. myClass.myMethod('some data');
		string functionCall = callback+'(\''+String.escapeSingleQuotes(dataString)+'\');';
		HttpRequest req = new HttpRequest();
		req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionID());
		req.setHeader('Content-Type', 'application/json');
		string instanceURL = System.URL.getSalesforceBaseUrl().getHost().remove('-api').toLowerCase();
		String toolingendpoint = 'https://'+instanceURL+'/services/data/v28.0/tooling/executeAnonymous/?anonymousBody='+encodingUtil.urlEncode(functionCall,'utf-8');
		req.setEndpoint(toolingendpoint);
		req.setMethod('GET');
		
		Http h = new Http();
		res = h.send(req);
	}
	catch(exception e)
	{
		system.debug('\n\n\n\n--------------------- Error attempting callback!');
		system.debug(e);
		system.debug(res);
	}
	return res;
} 

What’s going on here? The Tooling API allows us to execute anonymous code. Normally the Tooling API is for external tools/languages to access Salesforce metadata and perform operations. However, by accessing it via REST and passing in the name of a class and method, and properly encoding any data you’d like to pass (strings only, no complex object types), you can provide a dynamic callback specified at runtime. We simply create a GET request against the Tooling API REST endpoint and invoke the execute anonymous method, passing in the desired callback function call. So now when DeepClone, for example, is instantiated, the caller can set a class level property with the class and method it would like called when DeepClone is done doing its thing. It can pass back all the Ids of the records created so any additional work can be performed. Of course the class provided has to be public, and the method called must be static. Additionally you have to add your own org’s instance URL to the allowed remote sites under Security->Remote Site Settings. Anyway, I thought this was a pretty nice way of letting your @future and Queueable methods pass information back to a class so you aren’t totally left in the dark about what the results were. Enjoy!
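To make that concrete, here’s a rough sketch of what the receiving end might look like. The class and method names (cloneCallbackHandler.onCloneComplete) and the semicolon-delimited id format are just made-up examples; any public class with a static method that takes a single string will do.

//hypothetical callback receiver. The async job would be handed the string 'cloneCallbackHandler.onCloneComplete'
//and pass it to invokeCallback along with its result data when it finishes.
public class cloneCallbackHandler
{
    //must be public and static since it gets invoked via executeAnonymous as: cloneCallbackHandler.onCloneComplete('id1;id2;id3');
    public static void onCloneComplete(string dataString)
    {
        //in this sketch the data is just a delimited list of the created record ids
        list<string> createdIds = dataString.split(';');
        system.debug('Deep clone complete. ' + createdIds.size() + ' records were created.');
    }
}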


Deep Clone (Round 2)

So a day or two ago I posted my first draft of a deep clone, which would allow easy cloning of an entire data hierarchy. It was a semi proof-of-concept thing with some limitations (it could only handle somewhat smaller data sets, and didn’t let you configure all or nothing inserts, or specify whether you wanted to copy standard objects as well as custom). I was doing some thinking and I remembered hearing about the Queueable interface, which allows for asynchronous processing and bigger governor limits. I started thinking about chaining queueable jobs together to allow for copying much larger data sets. Each invocation would get its own governor limits and could theoretically go on as long as it took, since you can chain jobs indefinitely. I had attempted to use Queueable to solve this before, but I made the mistake of trying to kick off multiple jobs per invocation (one for each related object type). That obviously didn’t work due to the limits imposed on Queueable. Once I thought of a way to only need one invocation per call (basically rolling all the records that need to get cloned into one object and iterating over it) I figured I might have a shot at making this work. I took what I had written before, added a few options, and I think I’ve done it: an asynchronous deep clone that operates in distinct batches, with all or nothing handling and cleanup in case of error. This is hot off the presses code, so there are likely some lingering bugs, but I was too excited not to share it. Feast your eyes!

public class deepClone implements Queueable {

    //global describe to hold object describe data for query building and relationship iteration
    public map<String, Schema.SObjectType> globalDescribeMap = Schema.getGlobalDescribe();
    
    //holds the data to be cloned. Keyed by object type. Contains cloneData which contains the object to clone, and some data needed for queries
    public map<string,cloneData> thisInvocationCloneMap = new map<string,cloneData>();
    
    //should the clone process be all or nothing?
    public boolean allOrNothing = false;
    
    //each iteration adds the records it creates to this property so in the event of an error we can roll it all back
    public list<id> allCreatedObjects = new list<id>();
    
    //only clone custom objects. Helps to avoid trying to clone system objects like chatter posts and such.
    public boolean onlyCloneCustomObjects = true;
    
    public static id clone(id sObjectId, boolean onlyCustomObjects, boolean allOrNothing)
    {
        
        deepClone startClone= new deepClone();
        startClone.onlyCloneCustomObjects  = onlyCustomObjects;
        startClone.allOrNothing = allOrNothing;
        
        sObject thisObject = sObjectId.getSobjectType().newSobject(sObjectId);
        cloneData thisClone = new cloneData(new list<sObject>{thisObject}, new map<id,id>());
        map<string,cloneData> cloneStartMap = new map<string,cloneData>();
        
        cloneStartMap.put(sObjectId.getSobjectType().getDescribe().getName(),thisClone);
        
        startClone.thisInvocationCloneMap = cloneStartMap;
        return System.enqueueJob(startClone);
        
    }
    
    public void execute(QueueableContext context) {
        deepCloneBatched();
    }
        
    /**
    * @description Clones the objects in this invocation's clone map and queues another job to clone their children. Currently only clones custom objects by default; enabling standard objects is easy, but it is off by default because it increases the risk of hitting governor limits.
    * @return list<id> the ids of all of the objects created so far during the clone.
    **/
    public list<id> deepCloneBatched()
    {
        map<string,cloneData> nextInvocationCloneMap = new map<string,cloneData>();
        
        //iterate over every object type in the public map
        for(string relatedObjectType : thisInvocationCloneMap.keySet())
        { 
            list<sobject> objectsToClone = thisInvocationCloneMap.get(relatedObjectType).objectsToClone;
            map<id,id> previousSourceToCloneMap = thisInvocationCloneMap.get(relatedObjectType).previousSourceToCloneMap;
            
            system.debug('\n\n\n--------------------  Cloning ' + objectsToClone.size() + ' records of type ' + relatedObjectType);
            list<id> objectIds = new list<id>();
            list<sobject> clones = new list<sobject>();
            list<sObject> newClones = new list<sObject>();
            map<id,id> sourceToCloneMap = new map<id,id>();
            list<database.saveresult> cloneInsertResult;
                       
            //if this function has been called recursively, then the previous batch of cloned records
            //have not been inserted yet, so now they must be before we can continue. Also, in that case
            //because these are already clones, we do not need to clone them again, so we can skip that part
            if(objectsToClone[0].Id == null)
            {
                //if they don't have an id that means these records are already clones. So just insert them with no need to clone beforehand.
                cloneInsertResult = database.insert(objectsToClone,allOrNothing);

                clones.addAll(objectsToClone);
                
                for(sObject thisClone : clones)
                {
                    sourceToCloneMap.put(thisClone.getCloneSourceId(),thisClone.Id);
                }
                            
                objectIds.addAll(new list<id>(previousSourceToCloneMap.keySet()));
                //get the ids of all these objects.                    
            }
            else
            {
                //get the ids of all these objects.
                for(sObject thisObj :objectsToClone)
                {
                    objectIds.add(thisObj.Id);
                }
    
                //create a select all query to get all the data for these objects since if we only got passed a basic sObject without data 
                //then the clone will be empty
                string objectDataQuery = buildSelectAllStatment(relatedObjectType);
                
                //add a where condition
                objectDataQuery += ' where id in :objectIds';
                
                //get the details of this object
                list<sObject> objectToCloneWithData = database.query(objectDataQuery);
    
                for(sObject thisObj : objectToCloneWithData)
                {              
                    sObject clonedObject = thisObj.clone(false,true,false,false);
                    clones.add(clonedObject);               
                }    
                
                //insert the clones
                cloneInsertResult = database.insert(clones,allOrNothing);
                
                for(sObject thisClone : clones)
                {
                    sourceToCloneMap.put(thisClone.getCloneSourceId(),thisClone.Id);
                }
            }        
            
            for(database.saveResult saveResult :  cloneInsertResult)
            {
                if(saveResult.success)
                {
                    allCreatedObjects.add(saveResult.getId());
                }
                else if(allOrNothing)
                {
                    cleanUpError();
                    return allCreatedObjects;
                }
            }
              
            //Describes this object type so we can deduce its child relationships
            Schema.DescribeSObjectResult objectDescribe = globalDescribeMap.get(relatedObjectType).getDescribe();
                        
            //get this objects child relationship types
            List<Schema.ChildRelationship> childRelationships = objectDescribe.getChildRelationships();
    
            system.debug('\n\n\n-------------------- ' + objectDescribe.getName() + ' has ' + childRelationships.size() + ' child relationships');
            
            //then have to iterate over every child relationship type, and every record of that type and clone them as well. 
            for(Schema.ChildRelationship thisRelationship : childRelationships)
            { 
                          
                Schema.DescribeSObjectResult childObjectDescribe = thisRelationship.getChildSObject().getDescribe();
                string relationshipField = thisRelationship.getField().getDescribe().getName();
                
                try
                {
                    system.debug('\n\n\n-------------------- Looking at ' + childObjectDescribe.getName() + ' which is a child object of ' + objectDescribe.getName());
                    
                    if(!childObjectDescribe.isCreateable() || !childObjectDescribe.isQueryable())
                    {
                        system.debug('-------------------- Object is not one of the following: queryable, creatable. Skipping attempting to clone this object');
                        continue;
                    }
                    if(onlyCloneCustomObjects && !childObjectDescribe.isCustom())
                    {
                        system.debug('-------------------- Object is not custom and custom object only clone is on. Skipping this object.');
                        continue;                   
                    }
                    if(Limits.getQueries() >= Limits.getLimitQueries())
                    {
                        system.debug('\n\n\n-------------------- Governor limits hit. Must abort.');
                        
                        //if we hit an error, and this is an all or nothing job, we have to delete what we created and abort
                        if(allOrNothing)
                        {
                            cleanUpError();
                        }
                        return allCreatedObjects;
                    }
                    //create a select all query from the child object type
                    string childDataQuery = buildSelectAllStatment(childObjectDescribe.getName());
                    
                    //add a where condition that will only find records that are related to this record. The field on which the relationship is defined is stored in relationshipField
                    childDataQuery+= ' where '+relationshipField+ ' in :objectIds';
                    
                    //get the details of this object
                    list<sObject> childObjectsWithData = database.query(childDataQuery);
                    
                    system.debug('\n\n\n-------------------- Object queried. Found ' + childObjectsWithData.size() + ' records to clone');
                    
                    if(!childObjectsWithData.isEmpty())
                    {               
                        map<id,id> childRecordSourceToClone = new map<id,id>();
                        
                        for(sObject thisChildObject : childObjectsWithData)
                        {
                            childRecordSourceToClone.put(thisChildObject.Id,null);
                            
                            //clone the object
                            sObject newClone = thisChildObject.clone();
                            
                            //since the record we cloned still has the original parent id, we now need to update the clone with the id of its cloned parent.
                            //to do that we reference the map we created above and use it to get the new cloned parent.                        
                            system.debug('\n\n\n----------- Attempting to change parent of clone....');
                            id newParentId = sourceToCloneMap.get((id) thisChildObject.get(relationshipField));
                            
                            system.debug('Old Parent: ' + thisChildObject.get(relationshipField) + ' new parent ' + newParentId);
                            
                            //write the new parent value into the record
                            newClone.put(thisRelationship.getField().getDescribe().getName(),newParentId );
                            
                            //add this new clone to the list. It will be inserted once the deepClone job runs again. I know it's a little odd to not just insert them now,
                            //but it saves on redundant logic in the long run.
                            newClones.add(newClone);             
                        }  
                        cloneData thisCloneData = new cloneData(newClones,childRecordSourceToClone);
                        nextInvocationCloneMap.put(childObjectDescribe.getName(),thisCloneData);                             
                    }                                       
                       
                }
                catch(exception e)
                {
                    system.debug('\n\n\n---------------------- Error attempting to clone child records of type: ' + childObjectDescribe.getName());
                    system.debug(e); 
                }            
            }          
        }
        
        system.debug('\n\n\n-------------------- Done iterating cloneable objects.');
        
        system.debug('\n\n\n-------------------- Clone Map below');
        system.debug(nextInvocationCloneMap);
        
        system.debug('\n\n\n-------------------- All created object ids thus far across this invocation');
        system.debug(allCreatedObjects);
        
        //if our map is not empty that means we have more records to clone. So queue up the next job.
        if(!nextInvocationCloneMap.isEmpty())
        {
            system.debug('\n\n\n-------------------- Clone map is not empty. Sending objects to be cloned to another job');
            
            deepClone nextIteration = new deepClone();
            nextIteration.thisInvocationCloneMap = nextInvocationCloneMap;
            nextIteration.allCreatedObjects = allCreatedObjects;
            nextIteration.onlyCloneCustomObjects  = onlyCloneCustomObjects;
            nextIteration.allOrNothing = allOrNothing;
            id  jobId = System.enqueueJob(nextIteration);       
            
            system.debug('\n\n\n-------------------- Next queueable job scheduled. Id is: ' + jobId);  
        }
        
        system.debug('\n\n\n-------------------- Cloning Done!');
        
        return allCreatedObjects;
    }
     
    /**
    * @description create a string which is a select statement for the given object type that will select all fields. Equivalent to SELECT * FROM objectName in SQL.
    * @param objectName the API name of the object for which to build a query string
    * @return string a string containing the SELECT keyword, all the fields on the specified object and the FROM clause specifying that object type. You may add your own where statements after.
    **/
    public string buildSelectAllStatment(string objectName){ return buildSelectAllStatment(objectName, new list<string>());}
    public string buildSelectAllStatment(string objectName, list<string> extraFields)
    {       
        // Initialize setup variables
        String query = 'SELECT ';
        String objectFields = String.Join(new list<string>(globalDescribeMap.get(objectName).getDescribe().fields.getMap().keySet()),',');
        if(extraFields != null)
        {
            objectFields += ','+String.Join(extraFields,',');
        }
        
        objectFields = objectFields.removeEnd(',');
        
        query += objectFields;
    
        // Add FROM statement
        query += ' FROM ' + objectName;
                 
        return query;   
    }    
    
    public void cleanUpError()
    {
        database.delete(allCreatedObjects);
    }
    
    public class cloneData
    {
        public list<sObject> objectsToClone = new list<sObject>();        
        public map<id,id> previousSourceToCloneMap = new map<id,id>();  
        
        public cloneData(list<sObject> objects, map<id,id> previousDataMap)
        {
            this.objectsToClone = objects;
            this.previousSourceToCloneMap = previousDataMap;
        }   
    }    
}    

 

It’ll clone your record, your record’s children, your record’s children’s children, and yes, even your record’s children’s children’s children (you get the point)! Simply invoke the deepClone.clone() method with the id of the record to start the clone process at, whether you want to only copy custom objects, and whether you want all or nothing processing. Deep Clone takes care of the rest, automatically figuring out relationships, cloning, re-parenting, and generally being awesome. As always I’m happy to get feedback or suggestions! Enjoy!
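For example, kicking a clone off from anonymous Apex might look something like this (Survey__c is just a stand-in for whatever your root object actually is):

//hypothetical invocation: clone this record and its hierarchy, custom objects only, with all or nothing handling on
Survey__c rootRecord = [select Id from Survey__c limit 1];
id jobId = deepClone.clone(rootRecord.Id, true, true);
system.debug('Deep clone queueable job enqueued. Job id: ' + jobId);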

-Kenji


Salesforce True Deep Clone, the (Im)Possible Dream

So getting back to work work (sorry Alexa/Amazon/Echo, I’ve gotta pay for more smart devices somehow), I’ve been working on a project where there is a fairly in-depth hierarchy of records. We will call them surveys. These surveys have records related to them, those records have other records related to them, and so on. It’s a semi-complicated “tree” that goes about 5 levels deep with different kinds of objects in each “branch”. Of course with such a complicated structure, but a common need to copy and modify it for a new project, the request for a better clone came floating across my desk. Now Salesforce does have a nice clone tool built in, but it doesn’t have the ability to copy an entire hierarchy, and some preliminary searches didn’t turn up anything great either. The reason why? It’s pretty damn tricky, and governor limits can initially make it seem impossible. What I have here is an initial attempt at a ‘true deep clone’ function. You give it a record (or possibly a list of records, but I wouldn’t push your luck) to clone. It will do that, then clone its children and re-parent them to your new clone. It will then find all of those records’ children and clone and re-parent them as well, all the way down. Without further ado, here is the code.

    //clones a batch of records. Must all be of the same type.
    //very experimental. Small jobs only!
    public static Map<String, Schema.SObjectType> globalDescribeMap = Schema.getGlobalDescribe();    
    public static list<sObject> deepCloneBatched(list<sObject> objectsToClone) { return deepCloneBatched(objectsToClone,new map<id,id>());}
    public static list<sObject> deepCloneBatched(list<sObject> objectsToClone, map<id,id> previousSourceToCloneMap)
    {
        system.debug('\n\n\n--------------------  Cloning ' + objectsToClone.size() + ' records');
        list<id> objectIds = new list<id>();
        list<sobject> clones = new list<sobject>();
        list<sObject> newClones = new list<sObject>();
        map<id,id> sourceToCloneMap = new map<id,id>();
        
        
        if(objectsToClone.isEmpty())
        {
            system.debug('\n\n\n-------------------- No records in set to clone. Aborting');
            return clones;
        }
                
        //if this function has been called recursively, then the previous batch of cloned records
        //have not been inserted yet, so now they must be before we can continue. Also, in that case
        //because these are already clones, we do not need to clone them again, so we can skip that part
        if(objectsToClone[0].Id == null)
        {
            //if they don't have an id that means these records are already clones. So just insert them with no need to clone beforehand.
            insert objectsToClone;
            clones.addAll(objectsToClone);
            
            for(sObject thisClone : clones)
            {
                sourceToCloneMap.put(thisClone.getCloneSourceId(),thisClone.Id);
            }
                        
            objectIds.addAll(new list<id>(previousSourceToCloneMap.keySet()));
            //get the ids of all these objects.                    
        }
        else
        {
            //get the ids of all these objects.
            for(sObject thisObj :objectsToClone)
            {
                objectIds.add(thisObj.Id);
            }
            
            for(sObject thisObj : objectsToClone)
            {
                sObject clonedObject = thisObj.clone(false,true,false,false);
                clones.add(clonedObject);               
            }    
            
            //insert the clones
            insert clones;
            
            for(sObject thisClone : clones)
            {
                sourceToCloneMap.put(thisClone.getCloneSourceId(),thisClone.Id);
            }
        }        

        //figure out what kind of object we are dealing with
        string relatedObjectType = objectsToClone[0].Id.getSobjectType().getDescribe().getName();
        
        //Describes this object type so we can deduce its child relationships
        Schema.DescribeSObjectResult objectDescribe = globalDescribeMap.get(relatedObjectType).getDescribe();
                    
        //get this objects child relationship types
        List<Schema.ChildRelationship> childRelationships = objectDescribe.getChildRelationships();

        system.debug('\n\n\n-------------------- ' + objectDescribe.getName() + ' has ' + childRelationships.size() + ' child relationships');
        
        //then have to iterate over every child relationship type, and every record of that type and clone them as well. 
        for(Schema.ChildRelationship thisRelationship : childRelationships)
        { 
                      
            Schema.DescribeSObjectResult childObjectDescribe = thisRelationship.getChildSObject().getDescribe();
            string relationshipField = thisRelationship.getField().getDescribe().getName();
            
            try
            {
                system.debug('\n\n\n-------------------- Looking at ' + childObjectDescribe.getName() + ' which is a child object of ' + objectDescribe.getName());
                
                if(!childObjectDescribe.isCreateable() || !childObjectDescribe.isQueryable() || !childObjectDescribe.isCustom())
                {
                    system.debug('-------------------- Object is not one of the following: queryable, creatable, or custom. Skipping attempting to clone this object');
                    continue;
                }
                if(Limits.getQueries() >= Limits.getLimitQueries())
                {
                    system.debug('\n\n\n-------------------- Governor limits hit. Must abort.');
                    return clones;
                }
                //create a select all query from the child object type
                string childDataQuery = buildSelectAllStatment(childObjectDescribe.getName());
                
                //add a where condition that will only find records that are related to this record. The field on which the relationship is defined is stored in relationshipField
                childDataQuery+= ' where '+relationshipField+ ' in :objectIds';
                
                //get the details of this object
                list<sObject> childObjectsWithData = database.query(childDataQuery);
                
                if(!childObjectsWithData.isEmpty())
                {               
                    map<id,id> childRecordSourceToClone = new map<id,id>();
                    
                    for(sObject thisChildObject : childObjectsWithData)
                    {
                        childRecordSourceToClone.put(thisChildObject.Id,null);
                        
                        //clone the object
                        sObject newClone = thisChildObject.clone();
                        
                        //since the record we cloned still has the original parent id, we now need to update the clone with the id of its cloned parent.
                        //to do that we reference the map we created above and use it to get the new cloned parent.                        
                        system.debug('\n\n\n----------- Attempting to change parent of clone....');
                        id newParentId = sourceToCloneMap.get((id) thisChildObject.get(relationshipField));
                        
                        system.debug('Old Parent: ' + thisChildObject.get(relationshipField) + ' new parent ' + newParentId);
                        
                        //write the new parent value into the record
                        newClone.put(thisRelationship.getField().getDescribe().getName(),newParentId );
                        
                        //add this new clone to the list. It will be inserted once the deepClone function is called again. I know it's a little odd to not just insert them now,
                        //but it saves on redundant logic in the long run.
                        newClones.add(newClone);             
                    }  
                    //now we need to call this function again, passing in the newly cloned records, so they can be inserted, as well as passing in the ids of the original records
                    //that spawned them so the next time the query can find the records that currently exist that are related to the kind of records we just cloned.                
                    clones.addAll(deepCloneBatched(newClones,childRecordSourceToClone));                                  
                }                    
            }
            catch(exception e)
            {
                system.debug('\n\n\n---------------------- Error attempting to clone child records of type: ' + childObjectDescribe.getName());
                system.debug(e); 
            }            
        }
        
        return clones;
    }
     
    /**
    * @description create a string which is a select statement for the given object type that will select all fields. Equivalent to SELECT * FROM objectName in SQL.
    * @param objectName the API name of the object for which to build a query string
    * @return string a string containing the SELECT keyword, all the fields on the specified object and the FROM clause specifying that object type. You may add your own where statements after.
    **/
    public static string buildSelectAllStatment(string objectName){ return buildSelectAllStatment(objectName, new list<string>());}
    public static string buildSelectAllStatment(string objectName, list<string> extraFields)
    {       
        // Initialize setup variables
        String query = 'SELECT ';
        String objectFields = String.Join(new list<string>(Schema.getGlobalDescribe().get(objectName).getDescribe().fields.getMap().keySet()),',');
        if(extraFields != null)
        {
            objectFields += ','+String.Join(extraFields,',');
        }
        
        objectFields = objectFields.removeEnd(',');
        
        query += objectFields;
    
        // Add FROM statement
        query += ' FROM ' + objectName;
                 
        return query;   
    }

You should be able to just copy and paste that into a class, invoke the deepCloneBatched method with the record you want to clone, and it should take care of the rest, cloning every related record that it can. It skips non-custom objects for now (because I didn’t need them), but you can adjust that by removing the part of the if condition in the child relationship loop that says

|| !childObjectDescribe.isCustom()

And then it will also clone all the standard objects it can. Again, this is kind of a ‘rough draft’ but it does seem to be working. Even cloning 111 records of several different types, I was still well under all governor limits. I’d explain more about how it works, but the comments are there, it’s 3:00 in the morning, and I’m content to summarize the workings of it by shouting “It’s magic. Don’t question it” and walking off stage. Let me know if you have any clever ways to make it more efficient, which I have no doubt there are. Anyway, enjoy. I hope it helps someone out there.
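If you want a quick sanity check of how you’d call it, an invocation from anonymous Apex would look roughly like this (cloneUtils is whatever you named the class you pasted the methods into, and Survey__c is a stand-in for your root object):

//hypothetical usage; cloneUtils and Survey__c are placeholder names
Survey__c rootRecord = [select Id from Survey__c limit 1];
list<sObject> createdClones = cloneUtils.deepCloneBatched(new list<sObject>{rootRecord});
system.debug('Created ' + createdClones.size() + ' cloned records');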


Entity is deleted on apex merge

Hey guys,

Just a little quick fix post here, a silly little bug that took me a bit of time to hunt down (probably just because I hadn’t had enough coffee yet). Anyway, the error happens when trying to merge two accounts together; I was getting the error ‘entity is deleted’. The only thing that made my code any different from other examples was that the account I was trying to merge was being selected by picking it from a lookup on the master. The basic code looked like this (masterAccount was being set by the constructor for the class, so it is already set up properly).

            try
            {
                Account subAccount = new Account(id=masterAccount.Merge_With__c);
                merge masterAccount subAccount;
                mergeResult = 'Merge successful';
            }
            catch(exception e)
            {
                mergeResult = e.getMessage();
            }

Can you spot the problem here? Yup: because the Merge_With__c field on the master account would now be referencing an account that doesn’t exist (since after a merge the losing record gets deleted), it was throwing that error. So simple once you realize it. Of course the fix for it is pretty easy as well. Just null out the lookup field before the merge call.

            try
            {
                Account subAccount = new Account(id=masterAccount.Merge_With__c);
                masterAccount.Merge_With__c = null;
                merge masterAccount subAccount;
                mergeResult = 'Merge successful';
            }
            catch(exception e)
            {
                mergeResult = e.getMessage();
            }

There you have it. I realize this is probably kind of a ‘duh’ post but it had me stumped for a few minutes, and I’m mostly just trying to get back into the swing of blogging more regularly, so I figured I’d start with something easy. ‘Till next time!




Angel IVR REST API wrapper for Salesforce Apex

Hey all,

Just a random post to help out any developers who may be trying to use the Angel IVR outbound calling features of their new REST API. This is a wrapper class that should do all the hard work for you. It handles all the HTTP traffic, batching, parsing of responses, and serialization for ya. You’ll need to create a custom setting called Angel IVR Site and store your API key, API endpoint, and subscriber Id in there (or just change the references to settings.whatever in the code to be hard coded). The test class shows creation of one of these objects, along with the expected field names.
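Once the setting is in place, placing a call from Apex ends up looking roughly like this (the site id, phone number, and variable name below are all made up):

//hypothetical usage of the wrapper; all values here are placeholders
angelIVRWrapper.callItem thisCall = new angelIVRWrapper.callItem();
thisCall.phoneNumbers = '5555555555';
thisCall.variables.put('FIRST_NAME','Frank');

list<angelIVRWrapper.callResponse> responses = angelIVRWrapper.campaignCall(
    new list<angelIVRWrapper.callItem>{thisCall}, 'yourAngelSiteId');

system.debug('Angel job id: ' + responses[0].jobId);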

Here is the code. I’ll probably post up a sample app later and maybe even an installable package. I just wanted to get this out there before I forget, or get too lazy to do anything else with it.

/*Angel IVR API Wrapper
Description: Simple class for placing calls via the Angel IVR REST API. Also has experimental
implementations of the other API calls, including cancel, and request (gets job status)

See
https://www.socialtext.net/ivrwiki/outbound_rest_api_documentation
for API details.

Author: Daniel Llewellyn (Twitter: @Kenji776)
Date: 11/16/2012
*/

global class angelIVRWrapper
{
    class applicationException extends Exception {}

    public Angel_IVR_Site__c settings = Angel_IVR_Site__c.getValues('Prod');
    global static boolean isTest = false;

    //a single call item to place to angel. You will always pass a list of these. To make calls create a list
    //of these things (one for each person you wish to call if the call has variables unique to each person, or a single callitem with all the phone numbers included if
    //they are not unique). Then pass them into the campaignCall function along with the angel site you wish to use.
    global class callItem
    {
        public integer maxWaitTime = 30;
        public string phoneNumbers = '';
        public map<string,string> variables = new map<string,string>();
    }

    //a generic API response container. Will contain any error messages and status codes if something bombs.
    //otherwise it should contain a jobId you can later use for cancelling, checking status, etc. Also contains
    //a list of all the call items we attempted to place calls from, all the call items that were skipped (due to being invalid for some reason)
    //and a list of the responses as provided by angel.
    global class callResponse
    {
        public string jobId = '';
        public string message = 'Calls Placed Successfully';
        public integer httpResponseCode = 200;
        public string httpResponse = 'ok';
        public boolean success = true;
        public list<callRequest> callRequestResponses = new list<callRequest>();
        public list<callItem> placedRequests = new list<callItem>();
        public list<callItem> skippedRequests = new list<callItem>();
    }

    //return object type from the API that contains details about a single call placed using the outbound API.
    global class callRequest
    {
        public string code = '';
        public string callStartTime = '';
        public string callEndTime = '';
        public string phonenumber='';
        public string phoneLineRequestID='';
        public string message = '';
    }    

    @RemoteAction
    global static list<callResponse> campaignCall(list<callitem> callItems, string angelSite)
    {
        angelIVRWrapper controller = new angelIVRWrapper();
        return controller.campaignCall(callItems , angelSite, true);
    }

    //wrapper for the campaignCall function of the Angel IVR. Pass it a list of call items, one for each person you wish to call.
    //it will return a call response object which should contain the job Id you can use for getting the status later.
    public list<callResponse> campaignCall(list<callitem> callItems, string angelSite, boolean allowPartial)
    {
        list<FeedItem> posts = new list<FeedItem>();
        set<string> phoneNumbers = new set<string>();

        //experimental idea for breaking a large number of call items into batches, delayed
        //by a number of seconds. This will likely have to use a recursive scheduled job or something.
        integer batchDelaySeconds = 600;

        list<callResponse> res = new list<callResponse>();
        map<string,string> params = new map<string,string>();
        params.put('allowPartial',string.valueOf(allowPartial));     

        //ask the broker to make a call to angel using the campaignCalls method, passing in the list of call items, and set the allowPartial 
        //url param as well.
        integer counter = 0;
        list<callItem> thisBatch = new list<callItem>();
        list<callItem> skippedRequests = new list<callItem>();

        //break the overall list into batches of 250, since that is the maximum amount you can place in one request.
        for(callItem callItem : callItems)
        {  
            counter++;
            if(callItem.variables.size() <= 50 && !phoneNumbers.contains(callItem.phoneNumbers))
            {              
                thisBatch.add(callItem); 
                phoneNumbers.add(callItem.phoneNumbers);

                if(callitem.variables.containsKey('ID'))
                {
                    FeedItem post = new FeedItem();
                    post.ParentId = callitem.variables.get('ID'); 
                    post.Body = 'Placed outbound call to this record using phone number '+callItem.phoneNumbers+' to Angel site ' + angelSite;    
                    posts.add(post);        
                }                       
            }
            //if this call item has too many variables, or we already have a call for this phone number in the queue, add the call item to our list
            //of skipped calls.
            else
            {
                skippedRequests.add(callItem);
            }

            if(thisBatch.size() == 250 || counter == callItems.size())
            {
                callResponse thisResponse = brokerRequest('POST','campaignCalls',thisBatch, angelSite, params);
                thisResponse.skippedRequests.addAll(skippedRequests);
                res.add(thisResponse);  
                thisBatch.clear();  
                skippedRequests.clear();    
            }
        }
        insert posts;
        return res;
    }

    //wrapper for the requests function of the Angel IVR. Pass it a job id and get back the current status of that job.
    public callResponse requests(string jobId)
    {        
        //ask the broker to call angel using the requests method, passing in the job id
        //so we can get the current status of that job.
        callResponse res = brokerRequest('GET','requests/job/'+jobId,null,null,null);        
        return res;
    } 

    //wrapper for the cancels function of the Angel IVR. Pass it a job id and that job will be cancelled if it is currently queued.
    public callResponse cancels(string jobId)
    {
        //ask the broker to call angel using the cancels method, passing in the job id
        //of the job we want to cancel.
        callResponse res = brokerRequest('GET','cancels/job/'+jobId,null,null,null);        
        return res;    
    }

    //handles the actual sending of http requests, handling of the response, formatting, etc.
    public callResponse brokerRequest(string httpVerb, string method, list<callitem> callitems, string angelSite, map<string,string> urlParams)
    {
        //create a call response object to pass back.
        callResponse callResponse = new callResponse();

        string requestURI;
        //create the endpoint URI with a bit of string concatenation.
        if(angelSite !=null)
        {
            requestURI = settings.API_Endpoint__c+'/'+settings.Subscriber_ID__c+'/'+angelSite+'/'+method+'?apiKey='+settings.API_Key__c;
        }
        else
        {
            requestURI = settings.API_Endpoint__c+'/'+settings.Subscriber_ID__c+'/'+method+'?apiKey='+settings.API_Key__c;
        }    
        HttpRequest req = new HttpRequest();
        //setup the http request.
        req.setMethod(httpVerb);  
        req.setHeader('Content-Type','application/xml');     

        if(urlParams !=null)
        {
            for(string param : urlParams.keySet())
            {
                requestURI += '&'+param+'='+urlParams.get(param);
            }
        }
        req.setEndpoint(requestURI);

        if(callItems != null)
        {    
            //generating Angel XML using the serializer and set that as the request body.                  
            req.setBody(serializeCallItemAngelXml(callitems));
        }
        //send http request
        Http http = new Http();
        HttpResponse res = new HttpResponse();
        string responseBody;
        try
        {
            //some handling for if this is a test class or not. Can't make outbound calls in tests, so we need a mock response
            //if its a test.
            if(!isTest)
            {
                res = http.send(req); 
                responseBody = res.getBody();
            }
            else
            {
                responseBody = '<?xml version="1.0" encoding="UTF-8" standalone="yes"?><outboundRequest jobID="0a14021a-1c-13b0b0b3e8a-7abc2f20-d09"><callRequest><requestDetails number="9522206974" phoneLineRequestID="200158136763"/><attempt callEndTime="" callStartTime="" code="queued"><message>queued</message></attempt></callRequest><timeCreated>2012-11-16T16:06:24.331-05:00</timeCreated></outboundRequest>';           
            }

            //if the http response doesn't have a 200 response code, an error happened, so we gotta toss an error, set the success to false, etc.
            //trying to parse whatever is in the body of the http request would probably cause an error as well, so we want to avoid doing that.

            if(res.getStatusCode() != 200 && !isTest)
            {
                throw new applicationException('Error during HTTP Request. Status Code:' + res.getStatusCode() + ' Status:' + res.getStatus() + ' Body Text: ' + res.getBody());
            }

            //if all has gone well until now, parse the results of the http request into a callResponse object and pass that back. This will
            //contain the job id and status of the call(s)
            callResponse = deserializeAngelOutboundRequestXML(responseBody);          
        }
        catch(exception e)
        {          
            callResponse.success = false;
            callResponse.message = e.getMessage() + ' on line: ' + e.getLineNumber() + '. Root cause: ' + e.getCause();   
            callResponse.httpResponseCode = res.getStatusCode();   
            callResponse.httpResponse = res.getStatus();

        }   
        //requests and cancels pass in null call items, so guard against that before recording what we placed
        if(callItems != null)
        {
            callResponse.placedRequests.addAll(callItems);
        }
        return callResponse;           
    }

    //takes a list of callItem objects and turns them into valid AngelXML to send to the API. I wish
    //this was more dynamic (using some kind of reflection to iterate over the object properties), but whatever.
    //remember you can include variables in the callItem object to customize the information sent to Angel for each 
    //particular call.
    public static string serializeCallItemAngelXml(list<callitem> callitems)
    {
        string angelXML = '<callItems>';

        for(callitem thisCallItem : callItems)
        {
            angelXML += '<callItem>';
            angelXML += '<maxWaitTime>' + thisCallItem.maxWaitTime + '</maxWaitTime>';
            angelXML += '<phoneNumbers>' + thisCallItem.phoneNumbers + '</phoneNumbers>';
            for(string thisVar : thisCallItem.variables.keySet())
            {
                angelXML += '<variables><name>'+thisVar+'</name>';
                angelXML += '<value>'+thisCallItem.variables.get(thisVar)+'</value></variables>';
            }
            angelXML += '</callItem>';
        }

        angelXML += '</callItems>';
        return angelXML;
    }

    public static callResponse deserializeAngelOutboundRequestXML(string angelXMLResponse)
    {
        Xmlstreamreader reader = new Xmlstreamreader(angelXMLResponse);

        callResponse thisResponse = new callResponse();
        callRequest thisRequest;

        while (reader.hasNext()) 
        { 
            if(reader.getEventType() == XmlTag.START_ELEMENT && reader.getLocalName() == 'outboundRequest')
            {
                thisResponse.jobId = reader.getAttributeValue(null,'jobID'); 
            }  
            if(reader.getEventType() == XmlTag.START_ELEMENT && reader.getLocalName() == 'callRequest')
            {
                thisRequest = new callRequest();
            }            
            else if(reader.getEventType() == XmlTag.START_ELEMENT && reader.getLocalName() == 'requestDetails')
            {
                thisRequest.phonenumber = reader.getAttributeValue(null,'number');
                thisRequest.phoneLineRequestID = reader.getAttributeValue(null,'phoneLineRequestID');
            }
            else if(reader.getEventType() == XmlTag.START_ELEMENT && reader.getLocalName() == 'attempt')
            {
                thisRequest.callStarttime = reader.getAttributeValue(null,'callStartTime');
                thisRequest.callEndtime = reader.getAttributeValue(null,'callEndTime');
                thisRequest.code = reader.getAttributeValue(null,'code');
            }

            else if(reader.getEventType() == XmlTag.START_ELEMENT && reader.getLocalName() == 'message')
            {
                reader.next();
                thisRequest.message = getDecodedString(reader);
            }
            else if(reader.getEventType() == XmlTag.END_ELEMENT && reader.getLocalName() == 'attempt')
            {
                thisResponse.callRequestResponses.add(thisRequest);
            }            
            reader.next();
        }

        return thisResponse;
    }

    public static String getDecodedString(Xmlstreamreader reader)
    {
        return EncodingUtil.urlDecode(reader.getText(), 'UTF-8').trim();
    }

    @isTest
    public static void angelIVRWrapper()
    {
        isTest = true;

        //weird workaround to avoid mixed dml error, as seen here
        //http://boards.developerforce.com/t5/Apex-Code-Development/DML-not-allowed-on-user-in-test-context/m-p/98393
        User thisUser = [ select Id from User where Id = :UserInfo.getUserId() ];
            System.runAs ( thisUser ) {
            Angel_IVR_Site__c settings = new Angel_IVR_Site__c();
            settings.name = 'Prod';
            settings.Angel_Site_Id__c = 'test';
            settings.API_Endpoint__c = 'http://www.test.com';
            settings.API_Key__c = 'test api key';
            settings.Subscriber_ID__c = '402342';
            insert settings;
        }

        angelIVRWrapper controller = new angelIVRWrapper();
        list<angelIVRWrapper.callitem> callItems = new list<angelIVRWrapper.callitem>();
        Respondent__c testRespondent = testDataGenerator.createTestRespondent();

        angelIVRWrapper.callitem thisCallItem = new angelIVRWrapper.callItem();
        thisCallItem.phoneNumbers = '5555555555';
        thisCallItem.variables.put('RESPONDENT__R_NAME','Frank Jones');
        thisCallItem.variables.put('ID',testRespondent.id);

        callItems.add(thisCallItem);

        //place the calls first so we have a job id to check the status of and then cancel
        list<callResponse> sendCalls = angelIVRWrapper.campaignCall(callItems, '200000124604');

        callResponse requestStatus = controller.requests(sendCalls[0].jobId);

        callResponse sendCancel = controller.cancels(sendCalls[0].jobId);
    }
}