Oh my god. It's full of code!

Dynamic Apex Invocation/Callbacks

So I’ve been working on that DeepClone class and it occurred to me that whatever invokes that class might like to know when the process is done (so maybe it can do something with those created records). Seeing as DeepClone is by its very nature asynchronous, that presents a problem: the caller cannot sit and wait for the process to complete. You know what other language has to deal with async issues a lot? Javascript. In Javascript we often solve this problem with a ‘callback’ function (I know callbacks are old and busted, promises are the new hotness, but bear with me here), wherein you call your asynchronous function and tell it what to call when it’s done. Most often that is done by passing in the actual function itself instead of just its name, but both are viable. Here is an example of what both might look like.

var someData = 'data to give to async function';

//first type of invocation passes in an actual function as the callback. 
asyncThing(someData,function(result){
	console.log('I passed in a function directly!' + result);
});

//second type of invocation passes in the name of a function to call instead
asyncThing(someData,'onCompleteHandler');

function onCompleteHandler(result)
{
	console.log('I passed in the name of a function to call and that happened' + result);
}

function asyncThing(data,callback)
{
	//async code here, maybe a callout or something.
	var result = 'probably a status code or the fetched data would go here';
	
	//if our callback is a function, then just straight up invoke it
	if(typeof callback == 'function')
	{
		callback(result);
	}
	//if our callback is a string, then dynamically invoke it
	else if(typeof callback == 'string')
	{
		window[callback](result);
	}
}

So yeah, Javascript is cool, it has callbacks. What does this have to do with Apex? Apex is strongly typed; you can’t just pass functions around as arguments, and you sure as hell can’t do dynamic invocation… or can you? Behold, by abusing the Tooling API, I give you a basic implementation of a dynamic Apex callback!

public HttpResponse invokeCallback(string callback, string dataString)
{
	HttpResponse res = new HttpResponse();
	try
	{
		string functionCall = callback+'(\''+dataString+'\');';
		HttpRequest req = new HttpRequest();
		req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionID());
		req.setHeader('Content-Type', 'application/json');
		string instanceURL = System.URL.getSalesforceBaseUrl().getHost().remove('-api' ).toLowerCase();
		String toolingendpoint = 'https://'+instanceURL+'/services/data/v28.0/tooling/executeAnonymous/?anonymousBody='+encodingUtil.urlEncode(functionCall,'utf-8');
		req.setEndpoint(toolingendpoint);
		req.setMethod('GET');
		
		Http h = new Http();
		res = h.send(req);
	}
	catch(exception e)
	{
		system.debug('\n\n\n\n--------------------- Error attempting callback!');
		system.debug(e);
		system.debug(res);
	}
	return res;
} 

What’s going on here? The Tooling API allows us to execute anonymous code. Normally the Tooling API is for external tools/languages to access Salesforce metadata and perform operations. However, by accessing it via REST and passing in both the name of a class and method, and properly encoding any data you’d like to pass (strings only, no complex object types), you can provide a dynamic callback specified at runtime. We simply create a GET request against the Tooling API REST endpoint and invoke the execute anonymous method, passing in the desired callback function name. So now when DeepClone, for example, is instantiated, the caller can set a class level property with the class and method it would like called when DeepClone is done doing its thing. It can pass back all the Ids of the records created so any additional work can be performed. Of course the class provided has to be public, and the method called must be static. Additionally you have to add your own org’s URL to the allowed remote sites under security->remote site settings. Anyway, I thought this was a pretty nice way of letting your @future and queueable methods pass information back to a class so you aren’t totally left in the dark about what the results were. Enjoy!
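
To make that a little more concrete, here is a rough sketch of what a receiving class might look like. The class and method names here are just placeholders I made up; the only real requirements are that the class is public, the method is static, and it takes a string (since strings are all that survive the trip).

//hypothetical callback receiver. You'd pass 'cloneCompleteHandler.handleResult' as the callback name,
//and invokeCallback would build and execute: cloneCompleteHandler.handleResult('whatever data string');
public class cloneCompleteHandler
{
	public static void handleResult(string dataString)
	{
		//complex data has to be flattened into a string, e.g. a comma separated list of created record ids
		list<string> createdIds = dataString.split(',');
		system.debug('Async job finished. Received ' + createdIds.size() + ' record ids');
	}
}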

Deep Clone (Round 2)

So a day or two ago I posted my first draft of a deep clone, which would allow easy cloning of an entire data hierarchy. It was a semi proof of concept thing with some limitations (it could only handle somewhat smaller data sets, and didn’t let you configure all or nothing inserts, or specify whether you wanted to copy standard objects as well as custom ones). I was doing some thinking and I remembered hearing about the queueable interface, which allows for asynchronous processing and bigger governor limits. I started thinking about chaining queueable jobs together to allow for copying much larger data sets. Each invocation would get its own governor limits and could theoretically go on as long as it took, since you can chain jobs infinitely. I had attempted to use queueable to solve this before, but I made the mistake of trying to kick off multiple jobs per invocation (one for each related object type). This obviously didn’t work due to the limits imposed on queueable. Once I thought of a way to only need one invocation per call (basically just rolling all the records that need to get cloned into one object and iterating over it) I figured I might have a shot at making this work. I took what I had written before, added a few options, and I think I’ve done it. An asynchronous deep clone that operates in distinct batches, with all or nothing handling and cleanup in case of error. This is some hot off the presses code, so there are likely some lingering bugs, but I was too excited not to share this. Feast your eyes!

public class deepClone implements Queueable {

    //global describe to hold object describe data for query building and relationship iteration
    public map<String, Schema.SObjectType> globalDescribeMap = Schema.getGlobalDescribe();
    
    //holds the data to be cloned. Keyed by object type. Contains cloneData which contains the object to clone, and some data needed for queries
    public map<string,cloneData> thisInvocationCloneMap = new map<string,cloneData>();
    
    //should the clone process be all or nothing?
    public boolean allOrNothing = false;
    
    //each iteration adds the records it creates to this property so in the event of an error we can roll it all back
    public list<id> allCreatedObjects = new list<id>();
    
    //only clone custom objects. Helps to avoid trying to clone system objects like chatter posts and such.
    public boolean onlyCloneCustomObjects = true;
    
    public static id clone(id sObjectId, boolean onlyCustomObjects, boolean allOrNothing)
    {
        
        deepClone startClone= new deepClone();
        startClone.onlyCloneCustomObjects  = onlyCustomObjects;
        startClone.allOrNothing = allOrNothing;
        
        sObject thisObject = sObjectId.getSobjectType().newSobject(sObjectId);
        cloneData thisClone = new cloneData(new list<sObject>{thisObject}, new map<id,id>());
        map<string,cloneData> cloneStartMap = new map<string,cloneData>();
        
        cloneStartMap.put(sObjectId.getSobjectType().getDescribe().getName(),thisClone);
        
        startClone.thisInvocationCloneMap = cloneStartMap;
        return System.enqueueJob(startClone);
    }
    
    public void execute(QueueableContext context) {
        deepCloneBatched();
    }
        
    /**
    * @description Clones an object and the entire related data hierarchy. By default only clones custom objects; standard objects can be included via the onlyCloneCustomObjects flag, though that increases the risk of hitting governor limits
    * @return list<id> the ids of all the objects that were created during this invocation of the clone
    **/
    public list<id> deepCloneBatched()
    {
        map<string,cloneData> nextInvocationCloneMap = new map<string,cloneData>();
        
        //iterate over every object type in the public map
        for(string relatedObjectType : thisInvocationCloneMap.keySet())
        { 
            list<sobject> objectsToClone = thisInvocationCloneMap.get(relatedObjectType).objectsToClone;
            map<id,id> previousSourceToCloneMap = thisInvocationCloneMap.get(relatedObjectType).previousSourceToCloneMap;
            
            system.debug('\n\n\n--------------------  Cloning ' + objectsToClone.size() + ' records');
            list<id> objectIds = new list<id>();
            list<sobject> clones = new list<sobject>();
            list<sObject> newClones = new list<sObject>();
            map<id,id> sourceToCloneMap = new map<id,id>();
            list<database.saveresult> cloneInsertResult;
                       
            //if this function has been called recursively, then the previous batch of cloned records
            //have not been inserted yet, so now they must be before we can continue. Also, in that case
            //because these are already clones, we do not need to clone them again, so we can skip that part
            if(objectsToClone[0].Id == null)
            {
                //if they don't have an id that means these records are already clones. So just insert them with no need to clone beforehand.
                cloneInsertResult = database.insert(objectsToClone,allOrNothing);

                clones.addAll(objectsToClone);
                
                for(sObject thisClone : clones)
                {
                    sourceToCloneMap.put(thisClone.getCloneSourceId(),thisClone.Id);
                }
                            
                objectIds.addAll(new list<id>(previousSourceToCloneMap.keySet()));
                //get the ids of all these objects.                    
            }
            else
            {
                //get the ids of all these objects.
                for(sObject thisObj :objectsToClone)
                {
                    objectIds.add(thisObj.Id);
                }
    
                //create a select all query to get all the data for these objects since if we only got passed a basic sObject without data 
                //then the clone will be empty
                string objectDataQuery = buildSelectAllStatment(relatedObjectType);
                
                //add a where condition
                objectDataQuery += ' where id in :objectIds';
                
                //get the details of this object
                list<sObject> objectToCloneWithData = database.query(objectDataQuery);
    
                for(sObject thisObj : objectToCloneWithData)
                {              
                    sObject clonedObject = thisObj.clone(false,true,false,false);
                    clones.add(clonedObject);               
                }    
                
                //insert the clones
                cloneInsertResult = database.insert(clones,allOrNothing);
                
                for(sObject thisClone : clones)
                {
                    sourceToCloneMap.put(thisClone.getCloneSourceId(),thisClone.Id);
                }
            }        
            
            for(database.saveResult saveResult :  cloneInsertResult)
            {
                if(saveResult.success)
                {
                    allCreatedObjects.add(saveResult.getId());
                }
                else if(allOrNothing)
                {
                    cleanUpError();
                    return allCreatedObjects;
                }
            }
              
            //Describes this object type so we can deduce its child relationships
            Schema.DescribeSObjectResult objectDescribe = globalDescribeMap.get(relatedObjectType).getDescribe();
                        
            //get this objects child relationship types
            List<Schema.ChildRelationship> childRelationships = objectDescribe.getChildRelationships();
    
            system.debug('\n\n\n-------------------- ' + objectDescribe.getName() + ' has ' + childRelationships.size() + ' child relationships');
            
            //then have to iterate over every child relationship type, and every record of that type and clone them as well. 
            for(Schema.ChildRelationship thisRelationship : childRelationships)
            { 
                          
                Schema.DescribeSObjectResult childObjectDescribe = thisRelationship.getChildSObject().getDescribe();
                string relationshipField = thisRelationship.getField().getDescribe().getName();
                
                try
                {
                    system.debug('\n\n\n-------------------- Looking at ' + childObjectDescribe.getName() + ' which is a child object of ' + objectDescribe.getName());
                    
                    if(!childObjectDescribe.isCreateable() || !childObjectDescribe.isQueryable())
                    {
                        system.debug('-------------------- Object is not one of the following: queryable, creatable. Skipping attempting to clone this object');
                        continue;
                    }
                    if(onlyCloneCustomObjects && !childObjectDescribe.isCustom())
                    {
                        system.debug('-------------------- Object is not custom and custom object only clone is on. Skipping this object.');
                        continue;                   
                    }
                    if(Limits.getQueries() >= Limits.getLimitQueries())
                    {
                        system.debug('\n\n\n-------------------- Governor limits hit. Must abort.');
                        
                        //if we hit an error, and this is an all or nothing job, we have to delete what we created and abort
                        if(allOrNothing)
                        {
                            cleanUpError();
                        }
                        return allCreatedObjects;
                    }
                    //create a select all query from the child object type
                    string childDataQuery = buildSelectAllStatment(childObjectDescribe.getName());
                    
                    //add a where condition that will only find records that are related to this record. The field which the relationship is defined is stored in the maps value
                    childDataQuery+= ' where '+relationshipField+ ' in :objectIds';
                    
                    //get the details of this object
                    list<sObject> childObjectsWithData = database.query(childDataQuery);
                    
                    system.debug('\n\n\n-------------------- Object queried. Found ' + childObjectsWithData.size() + ' records to clone');
                    
                    if(!childObjectsWithData.isEmpty())
                    {               
                        map<id,id> childRecordSourceToClone = new map<id,id>();
                        
                        for(sObject thisChildObject : childObjectsWithData)
                        {
                            childRecordSourceToClone.put(thisChildObject.Id,null);
                            
                            //clone the object
                            sObject newClone = thisChildObject.clone();
                            
                            //since the record we cloned still has the original parent id, we now need to update the clone with the id of its cloned parent.
                            //to do that we reference the map we created above and use it to get the new cloned parent.                        
                            system.debug('\n\n\n----------- Attempting to change parent of clone....');
                            id newParentId = sourceToCloneMap.get((id) thisChildObject.get(relationshipField));
                            
                            system.debug('Old Parent: ' + thisChildObject.get(relationshipField) + ' new parent ' + newParentId);
                            
                            //write the new parent value into the record
                            newClone.put(thisRelationship.getField().getDescribe().getName(),newParentId );
                            
                            //add this new clone to the list. It will be inserted once the deepClone function is called again. I know it's a little odd to not just insert them now
                            //but it saves on redundant logic in the long run.
                            newClones.add(newClone);             
                        }  
                        cloneData thisCloneData = new cloneData(newClones,childRecordSourceToClone);
                        nextInvocationCloneMap.put(childObjectDescribe.getName(),thisCloneData);                             
                    }                                       
                       
                }
                catch(exception e)
                {
                    system.debug('\n\n\n---------------------- Error attempting to clone child records of type: ' + childObjectDescribe.getName());
                    system.debug(e); 
                }            
            }          
        }
        
        system.debug('\n\n\n-------------------- Done iterating cloneable objects.');
        
        system.debug('\n\n\n-------------------- Clone Map below');
        system.debug(nextInvocationCloneMap);
        
        system.debug('\n\n\n-------------------- All created object ids thus far across this invocation');
        system.debug(allCreatedObjects);
        
        //if our map is not empty that means we have more records to clone. So queue up the next job.
        if(!nextInvocationCloneMap.isEmpty())
        {
            system.debug('\n\n\n-------------------- Clone map is not empty. Sending objects to be cloned to another job');
            
            deepClone nextIteration = new deepClone();
            nextIteration.thisInvocationCloneMap = nextInvocationCloneMap;
            nextIteration.allCreatedObjects = allCreatedObjects;
            nextIteration.onlyCloneCustomObjects  = onlyCloneCustomObjects;
            nextIteration.allOrNothing = allOrNothing;
            id  jobId = System.enqueueJob(nextIteration);       
            
            system.debug('\n\n\n-------------------- Next queable job scheduled. Id is: ' + jobId);  
        }
        
        system.debug('\n\n\n-------------------- Cloning Done!');
        
        return allCreatedObjects;
    }
     
    /**
    * @description create a string which is a select statement for the given object type that will select all fields. Equivalent to Select * from objectName in SQL
    * @param objectName the API name of the object which to build a query string for
    * @return string a string containing the SELECT keyword, all the fields on the specified object and the FROM clause to specify that object type. You may add your own where statements after.
    **/
    public string buildSelectAllStatment(string objectName){ return buildSelectAllStatment(objectName, new list<string>());}
    public string buildSelectAllStatment(string objectName, list<string> extraFields)
    {       
        // Initialize setup variables
        String query = 'SELECT ';
        String objectFields = String.Join(new list<string>(globalDescribeMap.get(objectName).getDescribe().fields.getMap().keySet()),',');
        if(extraFields != null)
        {
            objectFields += ','+String.Join(extraFields,',');
        }
        
        objectFields = objectFields.removeEnd(',');
        
        query += objectFields;
    
        // Add FROM statement
        query += ' FROM ' + objectName;
                 
        return query;   
    }    
    
    public void cleanUpError()
    {
        database.delete(allCreatedObjects);
    }
    
    public class cloneData
    {
        public list<sObject> objectsToClone = new list<sObject>();        
        public map<id,id> previousSourceToCloneMap = new map<id,id>();  
        
        public cloneData(list<sObject> objects, map<id,id> previousDataMap)
        {
            this.objectsToClone = objects;
            this.previousSourceToCloneMap = previousDataMap;
        }   
    }    
}    

 

It’ll clone your record, your record’s children, your record’s children’s children, and yes, even your record’s children’s children’s children (you get the point)! Simply invoke the deepClone.clone() method with the id of the object to start the clone process at, whether you want to only copy custom objects, and whether you want to use all or nothing processing. Deep Clone takes care of the rest, automatically figuring out relationships, cloning, re-parenting, and generally being awesome. As always I’m happy to get feedback or suggestions! Enjoy!
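
For reference, kicking it off from anonymous Apex looks roughly like this (Survey__c is just a stand-in for whatever your root object actually is):

//grab the root record of the hierarchy you want to copy
id rootRecordId = [select Id from Survey__c limit 1].Id;

//clone the whole tree: custom objects only, with all or nothing handling turned on
id jobId = deepClone.clone(rootRecordId, true, true);

system.debug('Deep clone job enqueued. Job id: ' + jobId);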

-Kenji

Salesforce True Deep Clone, the (Im)Possible Dream

So getting back to work work (sorry alexa/amazon/echo, I’ve gotta pay for more smart devices somehow), I’ve been working on a project where there is a fairly in-depth hierarchy of records. We will call them surveys; these surveys have records related to them. Those records have other records related to them, and so on. It’s a semi-complicated “tree” that goes about 5 levels deep with different kinds of objects in each “branch”. Of course with such a complicated structure, but a common need to copy and modify it for a new project, the request for a better clone came floating across my desk. Now Salesforce does have a nice clone tool built in, but it doesn’t have the ability to copy an entire hierarchy, and some preliminary searches didn’t turn up anything great either. The reason why? It’s pretty damn tricky, and governor limits can initially make it seem impossible. What I have here is an initial attempt at a ‘true deep clone’ function. You give it a record (or possibly a list of records, but I wouldn’t push your luck) to clone. It will do that, then clone the children, and re-parent them to your new clone. It will then find all those records’ children and clone and re-parent them as well, all the way down. Without further ado, here is the code.

    //clones a batch of records. Must all be of the same type.
    //very experimental. Small jobs only!
    public static Map<String, Schema.SObjectType> globalDescribeMap = Schema.getGlobalDescribe();    
    public static list<sObject> deepCloneBatched(list<sObject> objectsToClone) { return deepCloneBatched(objectsToClone,new map<id,id>());}
    public static list<sObject> deepCloneBatched(list<sObject> objectsToClone, map<id,id> previousSourceToCloneMap)
    {
        system.debug('\n\n\n--------------------  Cloning record ' + objectsToClone.size() + ' records');
        list<id> objectIds = new list<id>();
        list<sobject> clones = new list<sobject>();
        list<sObject> newClones = new list<sObject>();
        map<id,id> sourceToCloneMap = new map<id,id>();
        
        
        if(objectsToClone.isEmpty())
        {
            system.debug('\n\n\n-------------------- No records in set to clone. Aborting');
            return clones;
        }
                
        //if this function has been called recursively, then the previous batch of cloned records
        //have not been inserted yet, so now they must be before we can continue. Also, in that case
        //because these are already clones, we do not need to clone them again, so we can skip that part
        if(objectsToClone[0].Id == null)
        {
            //if they don't have an id that means these records are already clones. So just insert them with no need to clone beforehand.
            insert objectsToClone;
            clones.addAll(objectsToClone);
            
            for(sObject thisClone : clones)
            {
                sourceToCloneMap.put(thisClone.getCloneSourceId(),thisClone.Id);
            }
                        
            objectIds.addAll(new list<id>(previousSourceToCloneMap.keySet()));
            //get the ids of all these objects.                    
        }
        else
        {
            //get the ids of all these objects.
            for(sObject thisObj :objectsToClone)
            {
                objectIds.add(thisObj.Id);
            }
            
            for(sObject thisObj : objectsToClone)
            {
                sObject clonedObject = thisObj.clone(false,true,false,false);
                clones.add(clonedObject);               
            }    
            
            //insert the clones
            insert clones;
            
            for(sObject thisClone : clones)
            {
                sourceToCloneMap.put(thisClone.getCloneSourceId(),thisClone.Id);
            }
        }        

        //figure out what kind of object we are dealing with
        string relatedObjectType = objectsToClone[0].Id.getSobjectType().getDescribe().getName();
        
        //Describes this object type so we can deduce its child relationships
        Schema.DescribeSObjectResult objectDescribe = globalDescribeMap.get(relatedObjectType).getDescribe();
                    
        //get this objects child relationship types
        List<Schema.ChildRelationship> childRelationships = objectDescribe.getChildRelationships();

        system.debug('\n\n\n-------------------- ' + objectDescribe.getName() + ' has ' + childRelationships.size() + ' child relationships');
        
        //then have to iterate over every child relationship type, and every record of that type and clone them as well. 
        for(Schema.ChildRelationship thisRelationship : childRelationships)
        { 
                      
            Schema.DescribeSObjectResult childObjectDescribe = thisRelationship.getChildSObject().getDescribe();
            string relationshipField = thisRelationship.getField().getDescribe().getName();
            
            try
            {
                system.debug('\n\n\n-------------------- Looking at ' + childObjectDescribe.getName() + ' which is a child object of ' + objectDescribe.getName());
                
                if(!childObjectDescribe.isCreateable() || !childObjectDescribe.isQueryable() || !childObjectDescribe.isCustom())
                {
                    system.debug('-------------------- Object is not one of the following: queryable, creatable, or custom. Skipping attempting to clone this object');
                    continue;
                }
                if(Limits.getQueries() >= Limits.getLimitQueries())
                {
                    system.debug('\n\n\n-------------------- Governor limits hit. Must abort.');
                    return clones;
                }
                //create a select all query from the child object type
                string childDataQuery = buildSelectAllStatment(childObjectDescribe.getName());
                
                //add a where condition that will only find records that are related to this record. The field which the relationship is defined is stored in the maps value
                childDataQuery+= ' where '+relationshipField+ ' in :objectIds';
                
                //get the details of this object
                list<sObject> childObjectsWithData = database.query(childDataQuery);
                
                if(!childObjectsWithData.isEmpty())
                {               
                    map<id,id> childRecordSourceToClone = new map<id,id>();
                    
                    for(sObject thisChildObject : childObjectsWithData)
                    {
                        childRecordSourceToClone.put(thisChildObject.Id,null);
                        
                        //clone the object
                        sObject newClone = thisChildObject.clone();
                        
                        //since the record we cloned still has the original parent id, we now need to update the clone with the id of its cloned parent.
                        //to do that we reference the map we created above and use it to get the new cloned parent.                        
                        system.debug('\n\n\n----------- Attempting to change parent of clone....');
                        id newParentId = sourceToCloneMap.get((id) thisChildObject.get(relationshipField));
                        
                        system.debug('Old Parent: ' + thisChildObject.get(relationshipField) + ' new parent ' + newParentId);
                        
                        //write the new parent value into the record
                        newClone.put(thisRelationship.getField().getDescribe().getName(),newParentId );
                        
                        //add this new clone to the list. It will be inserted once the deepClone function is called again. I know it's a little odd to not just insert them now
                        //but it saves on redundant logic in the long run.
                        newClones.add(newClone);             
                    }  
                    //now we need to call this function again, passing in the newly cloned records, so they can be inserted, as well as passing in the ids of the original records
                    //that spawned them so the next time the query can find the records that currently exist that are related to the kind of records we just cloned.                
                    clones.addAll(deepCloneBatched(newClones,childRecordSourceToClone));                                  
                }                    
            }
            catch(exception e)
            {
                system.debug('\n\n\n---------------------- Error attempting to clone child records of type: ' + childObjectDescribe.getName());
                system.debug(e); 
            }            
        }
        
        return clones;
    }
     
    /**
    * @description create a string which is a select statement for the given object type that will select all fields. Equivalent to Select * from objectName in SQL
    * @param objectName the API name of the object which to build a query string for
    * @return string a string containing the SELECT keyword, all the fields on the specified object and the FROM clause to specify that object type. You may add your own where statements after.
    **/
    public static string buildSelectAllStatment(string objectName){ return buildSelectAllStatment(objectName, new list<string>());}
    public static string buildSelectAllStatment(string objectName, list<string> extraFields)
    {       
        // Initialize setup variables
        String query = 'SELECT ';
        String objectFields = String.Join(new list<string>(Schema.getGlobalDescribe().get(objectName).getDescribe().fields.getMap().keySet()),',');
        if(extraFields != null)
        {
            objectFields += ','+String.Join(extraFields,',');
        }
        
        objectFields = objectFields.removeEnd(',');
        
        query += objectFields;
    
        // Add FROM statement
        query += ' FROM ' + objectName;
                 
        return query;   
    }

You should be able to just copy and paste that into a class, invoke the deepCloneBatched method with the record you want to clone, and it should take care of the rest, cloning every related record that it can. It skips non-custom objects for now (because I didn’t need them) but you can adjust that by removing the if condition at line 81 that says

|| !childObjectDescribe.isCustom()

And then it will also clone all the standard objects it can. Again this is kind of a ‘rough draft’ but it does seem to be working. Even cloning 111 records of several different types, I was still well under all governor limits. I’d explain more about how it works, but the comments are there, it’s 3:00 in the morning and I’m content to summarize the workings of it by shouting “It’s magic. Don’t question it”, and walking off stage. Let me know if you have any clever ways to make it more efficient, which I have no doubt there are. Anyway, enjoy. I hope it helps someone out there.
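
If it helps, invoking it from anonymous Apex looks something like this, assuming you pasted the methods into a class called cloneUtility (name it whatever you like; Survey__c is again just a placeholder object):

//query the root record you want to copy and hand it to the clone method in a list
Survey__c original = [select Id from Survey__c limit 1];

list<sObject> createdRecords = cloneUtility.deepCloneBatched(new list<sObject>{original});

system.debug('Created ' + createdRecords.size() + ' cloned records');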

Create an Alexa skill in Node.Js and host it on Heroku

Ok, here we go.

So for the last couple weeks I’ve been totally enamored with working with Alexa. It’s really fun to have a programming project that can actually ‘interact’ with the real world. Until now I’ve just been using my hacked together web server with If This Then That commands to trigger events. After posting some of my work on Reddit, I got some encouragement to try and develop an actual Alexa skill, instead of having to piggyback off IFTTT. With some sample code, an awesome guy willing to help a bit along the way, and more than a little caffeine, I decided to give it a shot.

The Approach: Since I already have a decent handle on Node.Js and I was informed there are libraries for working with Alexa, I decided Node would be my language of choice for accomplishing this. I’ll be using the alexa-app-server and alexa-app Node.Js libraries. I’ll be using two GitHub repos, and hosting the end result on Heroku.

How it’s done: I’ll be honest, this is a reasonably complex project with about a million steps, so I’ll do my best to outline them all but forgive me if I gloss over a few things. We will be hosting our server code on GitHub, creating a repo for the server, making a submodule for the skill, and deploying it all to Heroku. Let’s get started. First off, go and get yourself a GitHub account and a Heroku account. If you haven’t used git yet, get that installed. Also install the Heroku Toolbelt (which may come with git, I can’t quite remember). Of course you’ll also need Node.js and the Node Package Manager (NPM), which odds are you already have.
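
If you want to sanity check that everything is installed before going any further, each of these should spit back a version number instead of an error:

node -v
npm -v
git --version
heroku --version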

If you don’t want to create all the code and such from scratch and want to just start with a functioning app, feel free to clone my test app https://github.com/Kenji776/AlexaServer1.git

Create a new directory for your project on your local machine. Then head on over to GitHub and create yourself a new project/repo. Call it AlexaServer or something similar. Go ahead and make it a public repo. Do not initialize it with a readme at this time. This is the repo where the core server code will live. It’s important to think of the server code as a separate component from each individual skill; they are distinct things. You should see this screen.

repo1

Open a command prompt and change to the directory you created for your project. Enter the commands shown in the first section for creating a new repository. Once those commands are entered, if you refresh the screen you should see your readme file there with the contents shown, like this.

github1
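
By the way, if you already closed that GitHub screen, the new-repository commands it shows look roughly like this (swap in your own repo URL and readme text):

echo "# AlexaServer" >> README.md
git init
git add README.md
git commit -m "first commit"
git remote add origin https://github.com/YourUsername/AlexaServer.git
git push -u origin master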

Okay, now we are ready to get the alexa-app-server package; https://www.npmjs.com/package/alexa-app-server is what you are looking for. In your project directory type in

“npm install alexa-app-server”

This will take a few moments but should complete without any problems. In your project folder you’ll want to create a folder called apps. This is where each individual skill will live. We will cover that later. Now you’ll need to create your actual server file. Create a file called server.js. Put this in there.

'use strict';

var AlexaAppServer = require( 'alexa-app-server' );

var server = new AlexaAppServer( {
	httpsEnabled: false,
	port: process.env.PORT || 80
} );

server.start();

Pretty simple code overall. The weird bit of code in the port section is for Heroku (they give your app a port to use when hosted). If not on Heroku then it will default to using port 80. Now you need to create your Procfile. This is going to tell Heroku what to do when it tries to run your program. It should be a file named Procfile in the same directory with no file extension. The contents of which are simply

“web: node server.js”

without quotes. We will also want to create a package.json file. So again in your project directory run the command

npm init

This will run a script and it will ask you a few questions. Answer them and your package.json file should get generated. Go ahead and push this all into Github using the following command sequence.

git add .
git commit -m "added server.js and Procfile, along with alexa-app-server dependency"
git push origin master

If you view your Github repo online you should see all your files there. It should look something like this.

first repo push

 

You can see all of our files got pushed in. Now with the server set up, it’s time to create our skill. Create another GitHub repo. Call this whatever you like, hopefully something descriptive of the skill you are making. In your command prompt get into the apps directory within your main project. In there create another folder with a name the same or similar to your new GitHub repo. Follow the same steps as before to initialize the repo and do the initial commit/push. Now we are going to indicate to GitHub that the apps/test-skill folder is actually a submodule so any dependencies and such will be maintained within itself and not within the project root. To do this navigate to the root project folder and enter.

git submodule add https://github.com/Kenji776/AlexaTestSkill.git apps/test-skill

Replacing the GitHub project URL with the one for your skill, and the apps/test-skill with apps/whatever-your-skill-is-named. Now Git knows that this folder is a submodule, but NPM doesn’t know that. If you try and install anything using NPM for this skill it’s going to toss it into the root directory of the project. So we generate a package.json file for this skill and then NPM knows that this skill is a standalone thing. So run

npm init

Again and go through all the questions again. This should generate a package.json file for your skill. Now we are ready to install the actual alexa-app package. So run…

npm install alexa-app --save

and you should see that the skill now has its own node_modules folder, in which is contained the alexa-app dependency. After this you’ll have to regenerate your package.json file again because you’ve now added a new dependency. Now it’s time to make our skill DO something. In your skill folder create a file called index.js. Just to get started as a ‘hello world’ kind of app, plug this into that file.

module.change_code = 1;
'use strict';

var alexa = require( 'alexa-app' );
var app = new alexa.app( 'test-skill' );


app.launch( function( request, response ) {
	response.say( 'Welcome to your test skill' ).reprompt( 'Way to go. You got it to run. Bad ass.' ).shouldEndSession( false );
} );


app.error = function( exception, request, response ) {
	console.log(exception);
	console.log(request);
	console.log(response);	
	response.say( 'Sorry an error occurred ' + exception.message);
};

app.intent('sayNumber',
  {
    "slots":{"number":"NUMBER"}
	,"utterances":[ 
		"say the number {1-100|number}",
		"give me the number {1-100|number}",
		"tell me the number {1-100|number}",
		"I want to hear you say the number {1-100|number}"]
  },
  function(request,response) {
    var number = request.slot('number');
    response.say("You asked for the number "+number);
  }
);

module.exports = app;

Now, if you aren’t familiar with how Alexa works, I’ll try and break it down for you real quick to explain what’s happening with this code. Every action a skill can perform is called an intent. It is called this because ideally there are many ways a person might trigger that function. They might say “say the number {number}” or they might say “give me the number {number}” or many other variations, but they all have the same intent. So hence the name. Your intent should account for most of the common ways a user might try to invoke your functions. You do this by creating utterances. Each utterance represents something a user might say to make that function run. Slots are variables. You define the potential variables using slots, then use them in your utterances. There are different types of data a slot can contain, or you can create your own. Check out https://developer.amazon.com/public/solutions/alexa/alexa-skills-kit/docs/alexa-skills-kit-interaction-model-reference for more information on slot types. I’m still figuring them out myself a bit. So the intent is triggered by saying one of the utterances. The slot is populated with the number the person says, and after reading the variable from the request, it is read back to the user.

So now, believe it or not, your skill is usable. You can test it by starting your server. Again, in your command line shell within the root project directory type

node server.js

A console window should show up that looks like this.

skill running

Now in your browser you should be able to bring up an emulator/debugger page by heading to

http://yourserver/alexa/your-skill-name       (EX: http://localhost/alexa/test-skill)

You should get a page that looks like this.

emulator

Holy cow, we did it! We have a functioning skill. The only problem now is that there is no way for Alexa to ‘get to it’. As in, Alexa doesn’t know about this skill, so there is no way for Echo to route requests to it (also, I find it mildly creepy that when referring to Alexa, I keep accidentally wanting to type ‘her’. Ugh). So now we have to put this skill somewhere where it’s accessible. That is where Heroku is going to come in. Sure we could host this on our local machine, set up port forwarding, handle the SSL certs ourselves, but who wants to have their skill always running on their local box? I’ll do a bit on how to do that later since it requires creating a self-signed SSL certificate using OpenSSL and such, but for now I’m going to focus on getting this sucker into a cloud host.

Oh by the way, once everything is working smoothly, you should commit and push it all into GitHub. Remember, you’ll have to do a separate commit and push for your server, and for the skill submodule, since they are officially different things as far as GitHub is concerned, even though the skill is a subdirectory of your server. Also, I’m still learning exactly how submodules work, but if for some reason it doesn’t seem like your submodule is updating properly you can navigate to the project root folder and try this series of commands.

git submodule foreach git pull origin master
git add .
git commit -m "updated submodule"
git push origin master

with some luck that should do it. You’ll know it’s all working right when you open up your server repo in github, click the apps folder and you see a greyish folder icon for the name of your skill. Clicking it should show you the skill with all the files in there. Like this.

submodule

Now, on to Heroku. This part is pretty easy comparatively. Head on over to Heroku and create a new app. As usual, name it whatever you like, preferably something descriptive. Next you’ll be in the deploy area. You’ll be able to specify where the code comes from. Since we are smart, and we used GitHub, you can actually just link this Heroku project straight to your GitHub project.

hreoku connect

Just like that! However, this automatic integration does not seem to include submodules, so we will have to add our skill submodule from the command line. So again navigate to your project folder from the shell, and run this command to link your directory with the Heroku project.

heroku git:remote -a alexatestapp

Of course replacing alexatestapp with whatever the name of your Heroku app is. You should get back a response saying that the heroku remote has been added. Now we need to push it all into Heroku (which in retrospect may have made the above linking from the website unneeded, but whatever. It doesn’t seem to hurt anything). So now run

git push heroku master

You should be treated with a whole bunch of console output about compressing, creating runtimes, resolving dependencies, blah blah blah. Now with some luck your app should work via your hosted Heroku app. Just like when hosted locally we can access the emulator/debugger by going to

https://alexatestapp.herokuapp.com/alexa/test-skill

Replacing the domain with your heroku app, and the skill with whatever your skill is named. If all has gone well it should operate just like it did on your local machine. We are now finally ready to let Amazon and Echo know about our new skill. So head on over to https://developer.amazon.com/edw/home.html#/skills/list (you’ll need to sign up for a developer account if you do not have one) and click the add new skill button.

app create 1

On the next page it’s going to ask you for your intent schema. This is just a JSON representation of your intents and their slots. Thankfully that handy debug page generates that for you, so it’s just a matter of copy/pasting the intent schema from the debug page into that box. It will ask you about custom slot types, and odds are if you are making a more complicated app you are going to want to make your own, but really all that is is creating a name for your data type, listing common values for it (no, it doesn’t have to be every value) and saving it. Spend some time reading the docs on custom slot types if you need to. It’s also going to want sample utterances, which again that debug page generates for you, so again, just copy/paste. You might get a few odd errors here about formatting, or wrong slot types. Just do your best to work through them.

model

 

Next up is setting up the HTTPS. Thankfully Heroku handles that for us, so we don’t have to worry about it! Just select

“My development endpoint is a subdomain of a domain that has a wildcard certificate from a certificate authority”

Next up is account linking. I haven’t figured that one out yet, still working on it, so for now we just have to say no. Next up is the test screen. If everything has gone well up to this point, it should work just peachy for ya.

testsay

Next you’ll be asked to input some metadata about your app. Description, keywords, logo, icon, etc. Just fill it out and hit next. Finally you’ll be asked about privacy info. For now you can just say no to everything, and come back to update that when you’ve got your million dollar app ready to go. Hit save and you should be just about ready to test through your Echo. If you open your Alexa app on your phone and look for your skill by name, you should find it and it should be automatically enabled. You should now be able to go to your Echo, use the wake word, speak the invocation name and ask for a number. If all has gone well you’ll hear the wonderful sound of Alexa reading back your number to you. For example

“Alexa Ask Test Get My Number”

Alexa should respond with what you defined as the app start message in your skill. You can now say

“Say the number 10”

Which should then be repeated back to you. Congrats. You just created a cloud hosted, Node.Js Alexa Skill that is now ready for you to make do something awesome. Get out there and blow my mind. Good luck!

Making Amazon Echo/Alexa order me a pizza

UPDATE! I’ve started a github project for my server software. Still very alpha but you can check it out here: https://github.com/Kenji776/AlexaHomeHub

If you follow my blog, you might have caught my post yesterday about how I bought an Amazon Echo device, and have begun creating my own custom actions for it. I started small, simply making it call out to my web server by using If This Then That (IFTTT – a website that allows you to connect and integrate different services). I got it to connect to my door lock and unlock service I had written, and even got it to chromecast specific pre-setup videos to one of my TVs using a command line tool. Feeling somewhat confident, I decided it was time to take on something a little more in depth, but oh so worth it. I was going to make Alexa order me a pizza.

If you are an Echo/Alexa user you might know that there is already support for ordering a pizza, but only from Dominos. It uses some system of having a saved order, then tweeting a specific bit of info at the Dominos twitter account that is tied to your order, which then places it. This has a few drawbacks, the primary one being that it orders you a Dominos pizza (sorry guys, in all fairness Dominos has gotten a lot better in recent years). Also it requires twitter integration, and as far as I know it only supports one saved order (I could be wrong). Using a saved order was a good idea though, as it streamlines and simplifies the ordering process quite nicely. I wanted to do something like this, but instead I wanted Sarpinos pizza, and I wanted to be able to pick from several different pre-created orders. Using my knowledge of browser automation that I picked up from my door lock/unlock project and my existing web server, I got to work.

First off, I had to figure out all the things that needed to happen. Off the bat I knew I’d be using their online ordering interface. They don’t have an API, so I knew I’d have to be automating browser interactions. Next I had to break down the process of ordering the pizza online step by step, all the HTML elements involved and how to interact with them. Then I would be able to automate those interactions using the Selenium library. So I went through the process like a normal person and created this list. At each step I inspected the HTML elements involved and recorded them so I could figure out how to identify them and interact with them later. I created an order and saved it as a favorite so next time I’d come back in I would be prompted if I’d like to order that again. From there I was able to create the following list of things I knew needed to get done.

Order Steps:

1) Invoke: https://order.gosarpinos.com/Login/

2) Wait For Load

3) On load populate credential fields:
	- <input class="text-box single-line valid" data-val="true" data-val-required="Email is required" id="Email" name="Email" type="email" value="" >
	- <input class="text-box single-line password valid" data-val="true" data-val-required="Password is required" id="Password" name="Password" type="password" value="" >

4) Click Login button
	- <input type="submit" value="Login" class="ui-button ui-widget ui-state-default ui-corner-all" role="button" aria-disabled="false">
	
5) Wait For Page Load

6) Find button with provided favourite id (428388)
	- <button class="wcFavoriteSelectFavoriteButton ui-button ui-widget ui-state-default ui-corner-all ui-button-text-only" data-favorite-id="428388" id="wiSelectFavorite428388" type="button" role="button" aria-disabled="false">

7) Wait For Delivery Popup to load
	- 

 

A fair amount of steps, but none of them super complicated. I knew I’d have to learn a bit more about Selenium, as this interaction was definitely more complicated than the door lock code, and that one was already seemingly over complicated. Thankfully I did, and found out that in my previous attempt I had been mixing synchronous and async methods unknowingly, hence the perceived complexity (I thought driver.wait() was an async method and you put everything that depended on it inside. Turns out it’s synchronous and once the condition inside is true the program continues. No wonder it was acting a little funny). I knew also, since I was going to be passing in a fair amount of data (username, password, order id, credit card info, etc), that I should probably define an object which would have all the required properties, then just pass JSON into my web service that mirrored that object. This is what I came up with.

{
"action":"pizza",
"username":"xxxxxxxx@xxxx.com",
"password":"xxxxx",
"orderId":"428388",
"ccNumber":"xxxxxxxxxxxxxxxxxx",
"ccExpMonth":"03",
"ccExpYear":"2018",
"ccv":"xxxxx",
"tip":"5.00",
"ccZip":"xxxxx"
}

 

Obviously the sensitive values are blacked out, but you get the gist. My webserver is already primed to look for post requests that have a JSON payload. The ‘router’ code looks at the ‘action’ attribute to figure out what function to send the payload to. I created a new ‘pizza’ action and related function. Here is that function.

function orderPizza(orderObject,callback)
{
	var orderResult = new Object();
	
	//create instance of selenium web driver
	var driver = new webdriver.Builder().
	withCapabilities(webdriver.Capabilities.chrome()).
	build();

		
	//request the login page with the locks page as the return url
	driver.get('https://order.gosarpinos.com/Login');
	
	driver.findElement(webdriver.By.name('Email')).sendKeys(orderObject.username);
	driver.findElement(webdriver.By.name('Password')).sendKeys(orderObject.password);
	
	//find and click submit button
	driver.findElement(webdriver.By.css("input[type='submit']")).submit();

	console.log('Logged in as: ' + orderObject.username);

	//wait for order page to load
	driver.wait(function() {
		return driver.getTitle();
	},5000);
	
	console.log('Attempting To Choose Favorite Order With Id: ' + orderObject.orderId);
		
	//have to wait until the proper order button appears since it's in a dialog. If it isn't found after 5 seconds, fail. Otherwise click the corresponding order button
	//the favorite order button has an attribute 'type' of 'button' and a 'data-favorite-id' attribute with the id of that order
	
	driver.wait(function () {
		return driver.findElement(webdriver.By.css("div[aria-describedby='wiFavoriteListDialog']")).isDisplayed();
	}, 5000);
	
	
	driver.wait(function () {
		return driver.findElement(webdriver.By.css("button[data-favorite-id='"+orderObject.orderId+"']")).isDisplayed();
	}, 5000);
	
	driver.findElement(webdriver.By.css("button[data-favorite-id='"+orderObject.orderId+"']")).click();
			
	//after the button above is clicked, that dialog closes and another one opens. This one asks the user to select delivery or pickup. We want delivery
	//the delivery button has an attribute with a 'data-type' of 'WBD' and an attribute 'role' of 'button'
	driver.wait(function () {
		return driver.findElement(webdriver.By.css("button[data-type='WBD'][role='button']")).isDisplayed();
	}, 5000);
	
	driver.findElement(webdriver.By.css("button[data-type='WBD'][role='button']")).click();
	
	//ensure the checkout button is visible
	driver.wait(function () {
		return driver.findElement(webdriver.By.css("button[id='wiLayoutColumnGuestcheckCheckoutBottom'][role='button']")).isDisplayed();
	}, 5000);
	
	//hacky method to ensure that the modal dialog should now be gone and we can click the checkout button
	driver.sleep(2000);
	
	driver.findElement(webdriver.By.css("button[id='wiLayoutColumnGuestcheckCheckoutBottom'][role='button']")).click(); 
	
	//then the browser will move to the order screen. Once it loads we have to populate the order field data

	//wait for payment page to load
	driver.wait(function() {
		return driver.getTitle();
	},5000);	

	//wait until the pay by credit card button shows up.
	driver.wait(function () {
		return driver.findElement(webdriver.By.id("wiCheckoutPaymentCreditCard")).isDisplayed();
	}, 5000);

	//check the pay by credit card radio button
	driver.findElement(webdriver.By.id("wiCheckoutPaymentCreditCard")).click(); 
	
	//wait until the credit card number box shows up
	driver.wait(function () {
		return driver.findElement(webdriver.By.id("Payment_CCNumber")).isDisplayed();
	}, 5000);
	
	//populate the form fields
	driver.findElement(webdriver.By.name('Payment_CCNumber')).sendKeys(orderObject.ccNumber);

	//stupid jQuery ui selects are impossible to set with normal selenium since the original select is hidden. So use an execute script to set em.
	driver.executeScript("$('#Payment_ExpMonth').val("+orderObject.ccExpMonth+");");

	driver.executeScript("$('#Payment_ExpYear').val("+orderObject.ccExpYear+");");
		
	driver.findElement(webdriver.By.name('Payment_CVV')).sendKeys(orderObject.ccv);
	//driver.findElement(webdriver.By.name('Payment_CCTip')).sendKeys(parseFloat(orderObject.tip));
	driver.findElement(webdriver.By.name('Payment_AVSZip')).sendKeys(orderObject.ccZip);
	
	driver.executeScript("$('#Payment_CCTip').val("+parseFloat(orderObject.tip)+");");
	
	driver.findElement(webdriver.By.id('wiCheckoutButtonNext')).click();

	//wait for confirmation page to load.
	driver.wait(function() {
		return driver.getTitle();
	},5000);

	
	driver.findElement(webdriver.By.id("wiPlaceOrderNow")).click();	

	driver.wait(function() {
		return driver.getTitle();
	},5000).then(function(){
		console.log('Ordering complete!');
		orderResult.success = true;
		orderResult.message = 'Order Placed Successfully';
		callback(orderResult);			
	});
	

}

 

Now if that code seems a little dense or confusing, don’t feel bad. It took me several hours of trial and error to figure it out, especially when it came to setting the select list values, and getting the script to wait while various elements were created and destroyed by the page. Selenium has this super fun behavior where if you ever try and reference an element that doesn’t exist, the whole script goes down in flames. In response to that I made my code very ‘defensive’, frequently checking to make sure required elements exist and are displayed before attempting to interact with them.
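
If I ever clean it up, most of those wait-then-click pairs could collapse into a little helper along these lines (just a sketch, using the same selenium-webdriver calls as above):

//waits up to timeoutMs for the element matching 'locator' to be displayed, then clicks it.
//just wraps the wait-then-click pattern used repeatedly above so it only has to live in one place.
function waitAndClick(driver, locator, timeoutMs)
{
	return driver.wait(function () {
		return driver.findElement(locator).isDisplayed();
	}, timeoutMs).then(function () {
		return driver.findElement(locator).click();
	});
}

//example: waitAndClick(driver, webdriver.By.id('wiPlaceOrderNow'), 5000);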

With the script created and integrated into my web server ‘router’ I was ready to get IFTTT to invoke it. Once again it was as simple as creating a new recipe with Alexa as the ‘if’ and the Maker channel’s ‘make a web request’ feature as the ‘do’.

[Screenshots: the IFTTT pizza recipe, showing the Alexa trigger phrase and the Maker web request]

You can see that with the combination of specific phrases and the fact that you can have multiple saved orders, it would be easy to set up many different possibilities. My roommate is even going to create his own IFTTT account and link it to my Alexa. Then he can create his own orders, specify his own credit card information in the JSON payload, and order whenever he wants using the same device but with his own information. The next step, I think, is to encrypt the JSON payloads which contain the credit card info and then decrypt them when they arrive at my server. That way I’m not storing my CC info in plain text anywhere, which right now is a bit of a concern. This was mostly just proof of concept stuff last night, but I was too excited not to share it as soon as I could, so some of the ‘polish’ features are missing, but overall I think it’s a damn good start. Now if you’ll excuse me, I’m going to get myself a pizza.

Update: Adding encryption was pretty easy. First I got some encrypt and decrypt functions set up. Like this.

// Nodejs encryption with CTR
var crypto = require('crypto'),
    algorithm = 'aes-256-ctr',
    password = 'xxxxxxxxxxxxxxxxxxxxxxxxxx';

function encrypt(text){
  var cipher = crypto.createCipher(algorithm,password)
  var crypted = cipher.update(text,'utf8','hex')
  crypted += cipher.final('hex');
  return crypted;
}
 
function decrypt(text){
  var decipher = crypto.createDecipher(algorithm,password)
  var dec = decipher.update(text,'hex','utf8')
  dec += decipher.final('utf8');
  return dec;
}
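
A quick throwaway check to show the round trip works with the functions above (the card number here is obviously fake):

//round-trip sanity check using the encrypt/decrypt functions above
var testPayload = JSON.stringify({ccNumber: '4111111111111111', tip: 5.00});
var scrambled = encrypt(testPayload);
console.log(scrambled);                        //hex string, safe to stash in an IFTTT recipe
console.log(JSON.parse(decrypt(scrambled)));   //original object comes back out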

Then I updated my incoming data object so that the encrypted data sits in its own property, so I can still tell what kind of request it is without having to decrypt the payload first (since my web server supports other, unencrypted calls).

"action":"pizza",
"data":"f613f8f479bad299bdfedf [rest of encrypted string omitted]",
"encrypted":true

 

Then just had to change my ‘router’ to decrypt the incoming data if encryption was detected.

else if(action == 'pizza')
{
	//pizza request contains encrypted info. Decrypt and send to function
	var pizzaRequestData = new Object();
	
	if(parsedContent.encrypted)
	{
		console.log('Encrypted Payload Detected. Decrypting Contained Data');
		
		pizzaRequestData = JSON.parse(decrypt(parsedContent.data));
		
		console.log('Decryption complete');

		responseObject.message = 'Ordering Pizza!';
		
		console.log(responseObject.message);
		//because order pizza is async the result data comes in a callback
		orderPizza(pizzaRequestData,function(data){
			responseObject.pizzaRequest = data;
			console.log(data);
			sendResponse(response,responseObject);
			return;
		});
	}
	else
	{
		console.log('Un-encrypted pizza order detected. Skipping');
	}
}

After that I just had to use the encrypt function to generate an encrypted version of my pizza request data, update the IFTTT recipe with the new request, and that’s it! Now my CC information is safely encrypted and I don’t really have to worry about it getting intercepted. Yay security.
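
Generating that encrypted request body is just a one-off script along these lines (the field names match what my ordering script expects, but every value here is a placeholder):

//one-off helper to build the IFTTT request body; all values below are dummies
var order = {
	orderId: '12345',
	ccNumber: '4111111111111111',
	ccExpMonth: '01',
	ccExpYear: '2020',
	ccv: '123',
	ccZip: '55555',
	tip: 5.00
};

var iftttBody = JSON.stringify({
	action: 'pizza',
	data: encrypt(JSON.stringify(order)),
	encrypted: true
});

console.log(iftttBody); //paste this into the Maker channel's request body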

Amazon Alexa is going to run/ruin my life

It was my birthday recently, just turned 28. As a gift to myself I finally decided to order an Amazon Alexa cause I’ve wanted one since I heard about it a few months ago. If you aren’t familiar it’s basically like a ‘siri’ or ‘cortana’ thing that is a stand alone personal assistant device that lives in your home. It’s always on and responds to voice commands from surprisingly far away. It can tell you the weather, check your calendar, manage your shopping list and all that kind of nifty stuff. However, it can do more, much more. Thanks to the ability to develop custom ‘skills’ (their name for apps) and out of the box If This Then That (IFTTT) integration you can quickly start making Alexa do just about anything. I’ve owned it only a day now and I’ve already taught it two new tricks.

Also, if you aren’t familiar with IFTTT it’s an online service that basically allows you to create simple rules that perform actions (hence the name, if this then that). They have the ability to integrate all kinds of different services so you no longer have to be an advanced programmer to automate much of your life. It’s a cool free service and I’d highly recommend checking it out.

You may remember a while back I did that whole write-up about making automatic door locking software to lock and unlock my front door. I figured a good way to jump into making custom commands would be to see if I could teach Alexa to do it for me upon request. Turns out it was surprisingly easy. Since I already had the web service up and running to respond to HTTP post requests, I simply needed to create an IFTTT rule to send a request when Alexa heard a specific phrase. You may recall that I had some problems with IFTTT not seeming to work for me before, but it seems to now; might have been an error on my part for all I know. Here is the rule as it stands currently.

[Screenshots: the IFTTT door lock recipe, showing the Alexa trigger phrase and the Maker web request]

Every command issued to Alexa starts with the ‘wake word’, in this case I’ve chosen Alexa (since you can only pick between Alexa, Echo, and Amazon). Second is the command to issue so it knows what service to route the request to. For this the command is ‘trigger’, so Alexa knows to send the request to IFTTT. Then you simply include the phrase to match, and what to do. I decided to make the phrase ‘lock the door’, which, when heard, sends a post request with the given JSON payload to my web server that is listening. Boom, done.
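
The payload itself is tiny; I’m paraphrasing the exact body from the recipe above, but it’s essentially just a JSON blob telling my router which action to run:

{
	"action":"lock"
}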

The next thing I wanted to do, and this is still just a very rough outline of a final idea is Chromecast integration. Ideally I’d like to be able to say ‘Alexa trigger play netflix [moviename]’ but as of right now triggers created from IFTTT for Alexa can’t really contain variables aside from just the whole command itself. So I could do ‘Alexa trigger netflix bojack horseman’ and create a specific request just for that show, but there is no way to create a generic template kind of request and pass on the details to the web service that is listening. That aside, what I do have currently is a start.

I found a command line tool that can interact with the Chromecast (check this guide for Command Line Chromecast), and then created an exec statement to call it from my web service. My door lock and unlock service already has logic for handling different commands, so I just created a new one called ‘play’ that plays my test video.

else if(action == 'play')
{
	console.log('Casting Requested Thing!');
	var exec = require('child_process').exec;
	var cmd = 'castnow c:\\cast\\testVideo.mp4 --device "Upstairs Living Room"';

	//fire and forget; castnow stays running on its own, we just log any error that comes back
	exec(cmd, function(error, stdout, stderr) {
		if(error) console.log('Error invoking castnow: ' + error);
	});
}

So that turned out to be pretty easy. The small caveat is that castnow is really meant to be an application that you keep open and interact with to control the video. Since it is being invoked via a web service call, there’s no way to keep interacting with it once it’s running. I suppose you might be able to do some crazy shit like keeping a web socket open and continuing to pass commands to it, but that’s for another day.

The IFTTT command is basically the same as the door lock one. Just change the phrase that triggers it, and change the JSON payload to have the action as “play” instead of “lock” or “unlock”, and the command gets triggered. I also created a corollary rule and a bit of code for stopping the casting of the current video by playing another, empty video file (since there isn’t an explicit stop command in the castnow software).
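
I didn’t paste that stop handler above, but it’s essentially the same branch pointed at a blank clip, something like this (the action name and file name here are hypothetical):

else if(action == 'stop')
{
	console.log('Stopping cast by playing an empty video');
	var exec = require('child_process').exec;
	//casting a short blank clip replaces whatever is currently playing, which effectively stops it
	var cmd = 'castnow c:\\cast\\blank.mp4 --device "Upstairs Living Room"';

	exec(cmd, function(error, stdout, stderr) {
		if(error) console.log('Error invoking castnow: ' + error);
	});
}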

There you have it, with Alexa, IFTTT, and a home web server you can start to do some pretty cool customized automation stuff. I think next up is getting it to order my favorite local pizza for me😀

Mimicking callback functions for Visualforce ActionFunctions

Hey everyone. So I’ve got a nifty ‘approach’ for you this time around. So let me give you a quick run down on what I was doing, the problem I encountered and how I decided to solve it using what I believe to be a somewhat novel approach. The deal is that I have been working on a fairly complicated ‘one page’ app for mobile devices. What I decided to do was have one parent visualforce page, and a number of components that are hidden and shown depending on what ‘page’ the user is on. This allows for a global javascript scope to be shared between the components and also for them to have their own unique namespaces as well. I may cover the pros and cons of this architecture later.

The issue I started to have is that I wanted some action functions on the main parent container page to be used by the components in the page. That’s fine, no problem there. The problem is that actionFunctions are asynchronous and do not allow for dynamic callback functions, so anything that invokes your actionFunction is stuck with the same oncomplete function as every other caller. So if component A and component B both want to invoke ActionFunctionZ, they are both stuck with the same oncomplete function, and since it’s async there is no good way to know when it’s done. Or is there?

My solution to this problem doesn’t use any particularly amazing hidden features, just a bit of applied javascript knowledge. What we are going to do is create a javascript object in the global/top level scope. That object is going to have properties that match the names of action functions. The properties will contain the function to run once the action function is complete. Then that property will be deleted to clean up the scope for the next caller. That might sound a little whack. Here, let’s check an example.

    <style>
        #contentLoading
        {
            height: 100%;
            width: 100%;
            left: 0;
            top: 0;
            overflow: hidden;
            position: fixed; 
            display: table;
            background-color: rgba(9, 9, 12, 0.5);  
              
        }
        #spinnerContainer
        {
            display: table-cell;
            vertical-align: middle;        
            width:200px;
            text-align:center;
            margin-left:auto;
            margin-right:auto;
        }

        div.spinner {
          position: relative;
          width: 54px;
          height: 54px;
          display: inline-block;
        }
        
        div.spinner div {
          width: 12%;
          height: 26%;
          background: #fff;
          position: absolute;
          left: 44.5%;
          top: 37%;
          opacity: 0;
          -webkit-animation: fade 1s linear infinite;
          -webkit-border-radius: 50px;
          -webkit-box-shadow: 0 0 3px rgba(0,0,0,0.2);
        }
        
        div.spinner div.bar1 {-webkit-transform:rotate(0deg) translate(0, -142%); -webkit-animation-delay: 0s;}    
        div.spinner div.bar2 {-webkit-transform:rotate(30deg) translate(0, -142%); -webkit-animation-delay: -0.9167s;}
        div.spinner div.bar3 {-webkit-transform:rotate(60deg) translate(0, -142%); -webkit-animation-delay: -0.833s;}
        div.spinner div.bar4 {-webkit-transform:rotate(90deg) translate(0, -142%); -webkit-animation-delay: -0.75s;}
        div.spinner div.bar5 {-webkit-transform:rotate(120deg) translate(0, -142%); -webkit-animation-delay: -0.667s;}
        div.spinner div.bar6 {-webkit-transform:rotate(150deg) translate(0, -142%); -webkit-animation-delay: -0.5833s;}
        div.spinner div.bar7 {-webkit-transform:rotate(180deg) translate(0, -142%); -webkit-animation-delay: -0.5s;}
        div.spinner div.bar8 {-webkit-transform:rotate(210deg) translate(0, -142%); -webkit-animation-delay: -0.41667s;}
        div.spinner div.bar9 {-webkit-transform:rotate(240deg) translate(0, -142%); -webkit-animation-delay: -0.333s;}
        div.spinner div.bar10 {-webkit-transform:rotate(270deg) translate(0, -142%); -webkit-animation-delay: -0.25s;}
        div.spinner div.bar11 {-webkit-transform:rotate(300deg) translate(0, -142%); -webkit-animation-delay: -0.1667s;}
        div.spinner div.bar12 {-webkit-transform:rotate(330deg) translate(0, -142%); -webkit-animation-delay: -0.0833s;}
    
         @-webkit-keyframes fade {
          from {opacity: 1;}
          to {opacity: 0.25;}
        }    	
	</style>
	
	<script>
		var globalScope = new Object();
		//this object holds one pending callback per action function name
		globalScope.actionFunctionCallbacks = new Object();
		
		function actionFunctionOnCompleteDispatcher(functionName)
		{
			console.log('Invoking callback handler for ' +functionName);
			console.log(globalScope.actionFunctionCallbacks);
			
			if(globalScope.actionFunctionCallbacks.hasOwnProperty(functionName))
			{
				console.log('Found registered function. Calling... ');
				console.log(globalScope.actionFunctionCallbacks[functionName]);
				globalScope.actionFunctionCallbacks[functionName]();
				delete globalScope.actionFunctionCallbacks[functionName];
			}
			else
			{
				console.log('No callback handler found for ' + functionName);
			}    
		}         
		
		function registerActionFunctionCallback(functionName, callback)
		{
			console.log('Registering callback function for ' + functionName + ' as ' + callback);
			globalScope.actionFunctionCallbacks[functionName] = callback;
			
			console.log(globalScope.actionFunctionCallbacks);
		} 
		
		function linkActionOne(dataValue)
		{
			registerActionFunctionCallback('doThing', function(){
				console.log('Link Action One was clicked. Then the doThing action function was called. Once that was done this happened');
				alert('I was spawned from link action 1!');
			});		
			
			doThing(dataValue);
		}
		
		function linkActionTwo(dataValue)
		{
			registerActionFunctionCallback('doThing', function(){
				console.log('Link Action Two was clicked. Then doThing action function was called. Once that was done this happened');
				alert('I was spawned from link action 2!');
			});		

			doThing(dataValue);
		}

		function loading(isLoading) {
			if (isLoading) 
			{            
				$('#contentLoading').show();
			}
			else {
				$('#contentLoading').hide();
			}	
		}
	</script>
	<apex:form >
		<apex:actionFunction name="doThing" action="{!DoTheThing}" reRender="whatever" oncomplete="actionFunctionOnCompleteDispatcher('doThing');">
			<apex:param name="some_data"  value="" />
		</apex:actionFunction>
		
		<apex:actionStatus id="loading" onstart="loading(true)" onstop="loading(false)" />
	
		<a href="#" onclick="linkActionOne('Link1!')">Link One!</a>
		<a href="#" onclick="linkActionTwo('Link2!')">Link Two!</a>

		
id="contentLoading" style="display:none">
id="spinnerContainer">
class="spinner">
class="bar1">
class="bar2">
class="bar3">
class="bar4">
class="bar5">
class="bar6">
class="bar7">
class="bar8">
class="bar9">
class="bar10">
class="bar11">
class="bar12">
</div> </div> </div> </apex:form>

So what the hell is going on here? Long story short, we have two links which both call the same actionFunction but have different ‘callbacks’ that happen when that actionFunction is complete. I was trying to come up with a more interesting example, but I figured I should keep it simple for the sake of explanation. You click link one, the doThing action is called. Then it calls the actionFunctionOnCompleteDispatcher function with its own name. That function looks to see if any callback has been registered for that function name. If so, it is called. If not, it just doesn’t do anything. Pretty slick eh? You may be wondering why I included all that code with the action status, the loading animation, the overlay and all that. Not really relevant to what we are doing, right (though the animation is cool)? The answer (other than the fact you get a cool free loading mechanism) is that this approach as it stands will start to run into odd issues if your user clicks link two before link one has finished doing its work. The click from link two would overwrite the callback registered by link one, so when the first doThing completes it fires link two’s callback and link one’s never runs at all. I am thinking you could probably get around this by making the property of the global object an array instead of just a reference to a single function. Each call would push its requested callback into the array, and as callbacks were invoked they would be removed from the array, but I haven’t played with this approach yet (I’m pretty sure it would work, I’m just too lazy and tired to write it up for this post; if there is interest I’ll do it later). In any case, putting up a blocking loading screen while the action function does its work ensures that the user cannot cause any chaos by mashing links and overwriting callbacks.
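
I haven’t actually built that array version, but a rough sketch of what I’m picturing looks something like this (untested; same function names as above, just backed by a queue per action function):

//each action function name maps to a queue of pending callbacks instead of a single function
globalScope.actionFunctionCallbacks = {};

function registerActionFunctionCallback(functionName, callback)
{
	if(!globalScope.actionFunctionCallbacks[functionName])
	{
		globalScope.actionFunctionCallbacks[functionName] = [];
	}
	globalScope.actionFunctionCallbacks[functionName].push(callback);
}

function actionFunctionOnCompleteDispatcher(functionName)
{
	var queue = globalScope.actionFunctionCallbacks[functionName];
	if(queue && queue.length > 0)
	{
		//take the oldest registered callback off the front of the queue and run it
		var callback = queue.shift();
		callback();
	}
	else
	{
		console.log('No callback handler found for ' + functionName);
	}
}

This assumes the action function completions come back in the same order the callbacks were registered, which holds as long as the blocking loading screen keeps the calls serialized anyway.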

The thing that is kind of cool about this, and becomes clear pretty quickly, is that you can start ‘chaining’ callbacks. So you can have a whole series of action functions that all execute in sequence instead of just running async all over the place. You can do things like this. Also make note of the commenting. The thing about callbacks is that you can quickly end up in ‘callback hell’, where it gets very difficult to track what is happening in what order. So I at least attempt to label them in an effort to stem the madness. This is just a quick copy-paste from the thing I’m actually working on to give you a flavor of how the chaining can work.

//once a project has been created we need to clear out any existing temp record, then set the type of the new/current temp record as being tpe. Then finally
//we have to set the project Id on that temp record as the one we created. Then finally we can change page to the select accounts screen.
VSD_SelectProject.addNewTpeComplete = function()
{

	//order 2: happens after clearTempRecord is done 
	//once the existing temp record has been cleared and a new one is created, get a new temp record and set the type as tpe
	registerActionFunctionCallback('clearTempRecord', function(){
		setRequestType('tpe');
	});
				
	//order 3: happens after setRequestType is done 
	//once set request type is done (which means we should now have a temp record in the shared scope, then call set project id
	registerActionFunctionCallback('setRequestType', function(){
		setProjectId('{!lastCreatedProjectId}');
	});

	//order 4: happens after setProjectId is done 
	//once set project id is called and completed change the page to the new_pcr_start (poorly named, it should actually be called select_accounts)
	registerActionFunctionCallback('setProjectId', function(){
		setTitle('New TPE Request');
		setSubHeader('Select Accounts');
					
		changePage('new_pcr_start');
	});
	 
	//order 1: happens first. Kicks off the callback chain defined above.                                                
	clearTempRecord();                
}

Anyway, I hope this might help some folks. I know it would be easy to get around this issue in many cases by just creating many copies of the ‘same’ actionFunction with different names and oncomplete handlers, but who wants dirty repetitive code like that?

Tune in next time as I reveal the solution to an odd ‘bug’ that prevents apex:inputFields from binding to their controller values. Till next time!

-Kenji
