22 July 2014

REST Web Services - Part 1 - CFML

Dammit Adam! If there's one thing I hate more than learning, it's being inspired to learn!

Last week Adam Cameron got me thinking about REST web services. He challenged his readers: what is the best language for building them? Well, I had a few thoughts, which I shared. Basically, it depends on which language the developer in question is most comfortable in, and which environment is most accessible. In my time I've been lucky enough to be paid to work in a lot of languages and a lot of environments, so I thought I'd do a quick recap of a few languages and maybe even attempt a few I've never learnt. How long it takes before my patience and time run out, we'll see :)


My environment will be mostly Ubuntu, as that's what I run at home these days and I'm too poor to run a server. However, if I make it on to Microsoft development I'm anticipating switching to Windows 7. As for the REST services themselves, I'm going for Keep It Simple, Stupid here: basic proof-of-concept stuff. I want a GET and a POST method and that's about it. Also, I'm largely ignoring security for the time being.


First up is ColdFusion.
I’ve spent many of my years being paid as a CFML developer and I’m most comfortable with it. I should be able to knock out some REST web services here pretty quickly. As much as I like CFML, REST in CF is a pain in the neck. To me the syntax is not intuitive, mistakes are difficult to pinpoint, debugging is very difficult and testing is a nightmare. Someone reminded me of an add-on called Taffy, which is a REST framework for CF. I've never encountered Taffy myself, so I'm hopeful and excited for the challenge.


First I install Railo Express. It's not rocket science; suffice it to say it works, and in just two minutes I’m running a test cfm page. (Awesome job, Railo guys!)


Now I need to get Taffy running. So I download the zip file and extract it. Loosely skimming through the docs, I see I can just drop the taffy folder into my Railo root. Awesome, drag and drop. OK, now what? So back to the docs: I’ve got to add an extension to my Application.cfc and create a hello.cfc. So not quite drag and drop then. After about thirty minutes of fiddling around with resources directories and tweaking Application.cfc files, I’ve finally got the Taffy welcome page working properly. What held me up was *not* using the Application.cfc supplied in the docs but instead borrowing the one from the help folder. My setup is as follows:


Railo/webapps/ROOT/hiya
-- Application.cfc (see below)
-- index.cfm (empty cfm file)
Railo/webapps/ROOT/hiya/resources
-- hello.cfc (my REST webservices)


I now have Taffy running with Railo. This is awesome: the format for web services is pretty good, I don't need to endlessly update Railo or my application config, it just works. The test bed is the best thing; I love the Bootstrap layout, and it makes testing simple and easy. I've put a screenshot down at the bottom.

Here's some code.


My Application.cfc is:

<cfcomponent extends="taffy.core.api">
    <cfscript>

        this.name = hash(getCurrentTemplatePath());

        variables.framework = {};
        variables.framework.debugKey = "debug";
        variables.framework.reloadKey = "reload";
        variables.framework.reloadPassword = "true";
        variables.framework.representationClass = "taffy.core.genericRepresentation";
        variables.framework.returnExceptionsAsJson = true;

        function onApplicationStart(){
            return super.onApplicationStart();
        }

        function onRequestStart(TARGETPATH){
            return super.onRequestStart(TARGETPATH);
        }

        // this function is called after the request has been parsed and all request details are known
        function onTaffyRequest(verb, cfc, requestArguments, mimeExt){
            // this would be a good place for you to check API key validity and other non-resource-specific validation
            return true;
        }

    </cfscript>
</cfcomponent>




Finally, the important bit: my web services (hello.cfc):

component extends="taffy.core.resource" taffy_uri="/hello" {

    function get(){
        return representationOf(['all your base are belong to me. :)']);
    }

    function post(String name){
        return representationOf(['Nice to meet you ' & name]);
    }

}

And CFML is done. All this, including the install of Railo and Taffy, took me around an hour.

If you’re Adam Tuttle: congratulations, Taffy is a fantastic product and I'm very impressed. The ease with which I can deploy web services and, most importantly, test them is fantastic. I don’t need to create silly little HTML forms or anything; it’s all built in. This is genius. If I may offer my two cents, you've got a little work to do on the docs, especially the initial deployment. It’s gotta be step by step, built for idiots like me. If I can’t make it work quickly, I lose interest. That aside, you should be really proud.



07 July 2014

Android Wearables First Go

I thought I’d try the new Android Wear SDK and see if I could do anything useful with it. It took a while and a few head-scratching moments, but I got there in the end. What I wanted to do was send a message from an Android Wear watch to my Android phone. Seems like such an easy ask!


First up, you need Android Studio. I hope they make it possible with Eclipse / ADT, but I couldn't make it happen and quickly gave up! You also need to be super up to date with your SDK; as of today, my versions are:


  • Android SDK Tools 23.0.2
  • Android 4.4W (API 20)
  • Android 4.4.2 (API 19)
  • Google Play Services revision 18 (5.0)


Once all that is ready and working without error, you need to create a new project and follow the steps as per the Android developer page: http://developer.android.com/training/wearables/apps/creating.html
Basically, create an app for mobile as per usual and a partner application for wear. You also need to set up an emulator or use a real watch. I can’t afford a real Wear watch, so I'm on the emulator. Follow the steps to connect your phone via USB cable to the emulator; it's a bit fiddly but works eventually. The basic idea here is to use Google Play Services to transfer messages between the watch and the phone with the Message API. I believe this is new, which is why it is so important to ensure everything is up to date.


Now the code, first the mobile side of things.


These are the Gradle dependencies. My min SDK is 9 and my target SDK is 20.
dependencies {
    compile fileTree(dir: 'libs', include: ['*.jar'])
    wearApp project(':wear')
    compile 'com.android.support:appcompat-v7:19.+'
    compile 'com.google.android.gms:play-services-wearable:+'
}


Now, we’re not making a fancy front end here, as this is really just a proof of concept. Here’s MainActivity.java. As you should be able to see, this is a fairly simple listener.


public class MainActivity extends Activity implements GoogleApiClient.ConnectionCallbacks,
    GoogleApiClient.OnConnectionFailedListener,
    MessageApi.MessageListener{


Now some member variables:


    GoogleApiClient mGoogleApiClient;
    public static final String START_ACTIVITY_PATH = "/start/MainActivity";


Here is the on create:


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        // Create a GoogleApiClient instance
        mGoogleApiClient = new GoogleApiClient.Builder(this)
                .addApi(Wearable.API)
                .addConnectionCallbacks(this)
                .addOnConnectionFailedListener(this)
                .build();

        mGoogleApiClient.connect();
    }


What we’re doing here is initialising the Google API Client with the Wearable.API. These are the methods overridden from the implements section.


    @Override
    public void onConnected(Bundle bundle) {
        Log.i("mobile", "Connected");
        // We are connected; we can add our listener to this Activity.
        Wearable.MessageApi.addListener(mGoogleApiClient, this);
    }

    @Override
    public void onConnectionSuspended(int i) {
        Log.i("mobile", "Connection Suspended");
    }

    @Override
    public void onConnectionFailed(ConnectionResult connectionResult) {
        Log.i("mobile", "Connection Failed");
    }

    @Override
    public void onMessageReceived(MessageEvent messageEvent) {
        Log.i("mobile", "msg received and understood");

        if (messageEvent.getPath().equals(START_ACTIVITY_PATH)) {
            Log.i("mobile", "******WHOOOO*******");
            // Send a message to a Handler for the UI update.
            // (myHandler and DO_UPDATE_TEXT are declared elsewhere; not shown here.)
            myHandler.sendEmptyMessage(DO_UPDATE_TEXT);
        }
    }


That’s the mobile side of things more or less done :) Again, nothing fancy, just proof of concept.


Now the wearable part of the project. Here we’re going to send the message, but again we have to connect to Google API Client.


I’m just going to post the whole file here, as it’s probably easier. I’ll skip the layout, as it’s just a button.


  1. First (in onCreate) we define and connect our mGoogleApiClient.
  2. OnConnected we kick off the GetConnectedNodes AsyncTask. This needs to run separately from the UI and basically grabs all connected nodes. In our case there is only one, but you should really check here and maybe flash up a dialog or something if there are fewer or more than one client connected.
  3. Once that’s done we send a message with sendMsg(results.get(0));. This passes the node ID we got from the Wearable API and calls the sendMsg function.
  4. In sendMsg we call Wearable.MessageApi.sendMessage. As expected this sends our message. Right now the message is meaningless, but you could easily modify this example to send a real message and have the listener display it.
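
The "exactly one connected node" check mentioned above is worth spelling out. Here's a minimal plain-Java sketch of that guard (hypothetical helper name, no Android dependencies; in the real app the ids come from NodeApi.getConnectedNodes):

```java
import java.util.Arrays;
import java.util.List;

public class NodePick {
    // Return the sole connected node id, or null when the count isn't exactly one
    // (that's where you'd flash up a dialog to the user instead of silently
    // picking results.get(0) as my example does).
    static String pickSoleNode(List<String> nodeIds) {
        return nodeIds.size() == 1 ? nodeIds.get(0) : null;
    }

    public static void main(String[] args) {
        System.out.println(pickSoleNode(Arrays.asList("node-a")));          // the happy path
        System.out.println(pickSoleNode(Arrays.asList("node-a", "node-b"))); // ambiguous, so null
    }
}
```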


That’s it. Hope it helps, here is the wear project code:



package com.example.com.wearable;

import android.app.Activity;
import android.os.AsyncTask;
import android.os.Bundle;
import android.support.wearable.view.WatchViewStub;
import android.util.Log;
import android.view.View;
import android.widget.TextView;

import com.google.android.gms.common.ConnectionResult;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.wearable.MessageApi;
import com.google.android.gms.wearable.Node;
import com.google.android.gms.wearable.NodeApi;
import com.google.android.gms.wearable.Wearable;

import java.util.ArrayList;
import java.util.Collection;
import java.util.HashSet;

public class WearActivity extends Activity implements GoogleApiClient.ConnectionCallbacks,
        GoogleApiClient.OnConnectionFailedListener {

    private TextView mTextView;
    GoogleApiClient mGoogleApiClient;
    public static final String START_ACTIVITY_PATH = "/start/MainActivity";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_wear);

        // Create a GoogleApiClient instance
        mGoogleApiClient = new GoogleApiClient.Builder(this)
                .addApi(Wearable.API)
                .addConnectionCallbacks(this)
                .addOnConnectionFailedListener(this)
                .build();

        mGoogleApiClient.connect();

        final WatchViewStub stub = (WatchViewStub) findViewById(R.id.watch_view_stub);
        stub.setOnLayoutInflatedListener(new WatchViewStub.OnLayoutInflatedListener() {
            @Override
            public void onLayoutInflated(WatchViewStub stub) {
                mTextView = (TextView) stub.findViewById(R.id.text);

                findViewById(R.id.activity_wear_send_msg).setOnClickListener(new View.OnClickListener() {
                    @Override
                    public void onClick(View v) {
                        GetConnectedNodes task = new GetConnectedNodes();
                        task.execute(new String[]{"com"});
                    }
                });
            }
        });
    }

    private Collection<String> getNodes() {
        HashSet<String> results = new HashSet<String>();
        NodeApi.GetConnectedNodesResult nodes = Wearable.NodeApi.getConnectedNodes(mGoogleApiClient).await();
        for (Node node : nodes.getNodes()) {
            results.add(node.getId());
            Log.i("wear", node.getId());
        }
        return results;
    }

    @Override
    public void onConnected(Bundle bundle) {
        Log.i("wear", "Connection success");
        GetConnectedNodes task = new GetConnectedNodes();
        task.execute(new String[] { "" });
    }

    @Override
    public void onConnectionSuspended(int i) {
        Log.i("wear", "Connection suspended");
    }

    @Override
    public void onConnectionFailed(ConnectionResult connectionResult) {
        Log.i("wear", "Connection failed");
    }

    private void sendMsg(String node) {
        String msg = "All your base";

        MessageApi.SendMessageResult result = Wearable.MessageApi.sendMessage(mGoogleApiClient, node, START_ACTIVITY_PATH, msg.getBytes()).await();
        if (!result.getStatus().isSuccess()) {
            Log.e("wear", "ERROR: failed to send Message: " + result.getStatus());
        } else {
            Log.i("wear", "Message sent: " + result.getStatus());
        }
    }

    private class GetConnectedNodes extends AsyncTask<String, Void, Void> {
        protected Void doInBackground(String... params) {
            ArrayList<String> results = new ArrayList<String>();
            NodeApi.GetConnectedNodesResult nodes = Wearable.NodeApi.getConnectedNodes(mGoogleApiClient).await();

            Log.i("wear", "Hello from GetConnectedNodes");
            Log.i("wear", "node count:" + String.valueOf(nodes.getNodes().size()));

            for (Node node : nodes.getNodes()) {
                results.add(node.getId());
                Log.i("wear", node.getId());
            }

            if (results.size() > 0) {
                sendMsg(results.get(0));
            }
            return null;
        }
    }
}
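
One detail worth noting from sendMsg above: the Message API carries the payload as a raw byte array, so the listener on the phone has to decode messageEvent.getData() back to text itself. A plain-Java sketch of that round trip (hypothetical class name, no Android dependencies):

```java
import java.nio.charset.StandardCharsets;

public class PayloadRoundTrip {
    static final String START_ACTIVITY_PATH = "/start/MainActivity";

    // What sendMsg does before handing the payload to MessageApi.sendMessage.
    static byte[] encode(String msg) {
        return msg.getBytes(StandardCharsets.UTF_8);
    }

    // What a listener would do with messageEvent.getData() on the other side.
    static String decode(byte[] data) {
        return new String(data, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] wire = encode("All your base");
        // Remember: the listener matches on the path, the payload is just cargo.
        System.out.println(START_ACTIVITY_PATH + " -> " + decode(wire));
    }
}
```

My sendMsg calls msg.getBytes() with the platform default charset; pinning UTF-8 explicitly on both sides, as here, avoids any surprises between devices.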

And here's the gradle:

dependencies {
    compile fileTree(dir: 'libs', include: ['*.jar'])
    compile 'com.google.android.support:wearable:+'
    compile 'com.google.android.gms:play-services:+'
}




03 July 2014

24 Hours with Google Glass

So I've been lucky enough to get my hands on a pair of Google Glass and took them for a spin. Opinion is largely split as to their worth. They've certainly caused some excitement and a lot of chatter around the office.


Recently, news has reached us that they've been banned in UK cinemas. Not a huge story, but for some reason the media has made a big fuss about it. Why on earth anyone would want to record a whole film using a shaky, crummy camera on their head I don't know. Does anyone download poor-quality in-cinema recordings these days?


The first thing everyone asks about is privacy: are you filming me? In toilets, etc. I think this is just a fear thing; it's not that people record video where they shouldn't, it's that they could. I remember the same discussion about camera phones, and now the number of people I see texting while in the washroom barely gets noticed any more... although it is gross. It’s slightly odd to me that people worry about this but don’t worry about CCTV.


The second thing (and I think the most important thing) that strikes most people is: what do they do? OK, so you put them on and take a photo or two, or maybe play with the fantastic star chart app. After that it's a case of... OK, now what? I think the big takeaway here is that Glass is reactive. It's great for notifications; texts, emails, even phone calls work really well. Glass is happy to read them out to you and you can even reply via voice, which works brilliantly. As a thing to "play" with, Glass is not impressive.


Most apps work well with voice, and Glass does a good job of presenting the options available to you, although it does need to improve. Google Play Music was notably poor here: you can't start or stop music with voice; instead you have to tap the side. Any application that forces me to tap the side of the glasses defeats the point. Why have a hands-free wearable I can talk to if I can only talk to it half of the time? I can tap my phone for that. It's in development though!


Next up is the car; obviously I tried this as a passenger :) First, the navigation app, which is very impressive: staying quiet until a manoeuvre comes up, then piping up with clear instructions and audio. Second is hands-free notifications; as I mentioned before, this really is great hands free and does mostly work well. However, it is incredibly distracting. Even though you're still looking in the direction of the car in front, you're not focusing, so I would strongly suggest not using it whilst driving.


This leads nicely into my conclusion: what are they for? The cold hard fact is they are big and look weird, so you're not going to use them out with friends. I was very self-conscious in public, so I avoided wearing them out and about. So if you don't use them when driving, don't use them in public and don't use them with friends, when would you use them? Which begs the question: where are Google taking them? Are they hoping we'll all just suck it up and start looking a bit nerdy? Or is it just one big experiment?


So, to conclude: they are neat, they mostly work well and they seem like a great step toward augmented reality. However, they *are not* augmented reality; a few killer apps would work wonders here, but I can't see that happening. I think the privacy critics will hush eventually. I really don't think much will happen until they make them smaller and more discreet, but maybe this is just Google's aim: to nudge the rest of the world toward better lenses and smaller technology. For now, I think Google will push wearables like watches more than Glass; watches don't have cameras! I suspect Glass will become just another experiment or niche product. Maybe the technology for augmented reality just isn't there yet, but good try, Google.