Is there more to Amazon IoT than IoT?
The initial rant…
Ever since I got into technology in 1992, I have been looking for the right combination of tools to fulfil a specific requirement. Back in 1992 there really weren’t that many libraries, utilities and services out there. The plain fact was that you even had to pay for a C compiler and IDE, which was delivered on too many floppy disks to count.
So you had to be resourceful, make the most of what was on offer, and try to build what you needed with what was there. The interesting thing about the world of today is that we have too many tools that do the same thing, and to help people navigate this complex landscape of software and services, we have decided to name these tools after their intended purpose.
Amazon IoT is for IoT, right?
Well, I suppose the short answer to the title is yes. I would go so far as to say that it is great for IoT use-cases. What I would like to explore in this blog instead is whether there is anywhere else we can use this great piece of tech.
The first step to understanding a technology is to look under the skin and see how it is built. Amazon IoT is, at its core, a high-speed publish/subscribe event bus that uses a standard protocol, MQTT, which runs over TCP/IP.
Amazon were also nice enough to add a lot of great features to their “IoT” service, such as rules that act as automatic consumers of inbound messages and can forward them to services like SNS and DynamoDB. These are great because they allow you to ingest data from devices without writing any code at all.
I am a developer at heart, and therefore a bit of a masochist: I personally think that if you don’t write a bit of code it’s not really that much fun. Of course, most of us don’t have an IoT device lying around either, so the classic ingestion scenario is not that easy to experiment with. So I thought it might be interesting to use Amazon IoT for remote IPC (inter-process communication).
Remote IPC over IoT
Now that is a mouthful, but what exactly are we trying to achieve? Let’s just say that I thought it would be nice to run code on a different computer, possibly on a different network, and have it stream the data back to me. I also thought it would be a nice challenge to see if this could be achieved using an event-driven architecture.
MQTT is a lightweight, low-latency protocol, and the Amazon IoT implementation of it is fast and fully managed. That makes it great for this use-case, because messages travel back and forth between the processes quickly. The other challenge is streaming data in a scalable way. The thing with messaging is that a message is exactly that: a small, self-contained piece of data. So when messages form part of a data stream, we have to treat them as packets.
Essentially, once we have implemented this little experiment, we will have a fast, highly scalable solution for RPC and data streaming. IoT devices stream data anyway, right? So this looks like it should work quite nicely.
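To make the packet idea concrete, here is a minimal sketch (plain Java, no MQTT involved) of splitting a payload into fixed-size chunks, each tagged with a sequence number and a last-packet flag so the receiving side can reassemble the stream. The class and field names, and the chunk size, are illustrative assumptions on my part, not part of any AWS or Paho API.

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Sketch: treat stream data as packets. Each packet would travel
// as the payload of a single MQTT message.
class StreamChunker {

    // One "packet": a sequence number, a last-packet flag, and a body.
    static final class Packet {
        final int sequence;
        final boolean last;
        final byte[] body;

        Packet(int sequence, boolean last, byte[] body) {
            this.sequence = sequence;
            this.last = last;
            this.body = body;
        }
    }

    // Split a payload into packets of at most chunkSize bytes.
    static List<Packet> chunk(byte[] payload, int chunkSize) {
        List<Packet> packets = new ArrayList<>();
        for (int offset = 0, seq = 0; offset < payload.length; offset += chunkSize, seq++) {
            int end = Math.min(offset + chunkSize, payload.length);
            byte[] body = new byte[end - offset];
            System.arraycopy(payload, offset, body, 0, body.length);
            packets.add(new Packet(seq, end == payload.length, body));
        }
        return packets;
    }

    // Reassemble packets (assumed already ordered by sequence) into the payload.
    static byte[] reassemble(List<Packet> packets) {
        int total = 0;
        for (Packet p : packets) total += p.body.length;
        byte[] out = new byte[total];
        int offset = 0;
        for (Packet p : packets) {
            System.arraycopy(p.body, 0, out, offset, p.body.length);
            offset += p.body.length;
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] data = "a stream of data split into MQTT-sized packets".getBytes(StandardCharsets.UTF_8);
        List<Packet> packets = chunk(data, 16);
        System.out.println(packets.size() + " packets");
        System.out.println(new String(reassemble(packets), StandardCharsets.UTF_8));
    }
}
```

In a real deployment the sequence number and last flag would be carried in the JSON body of each message, since MQTT itself does not guarantee application-level ordering across a stream of related messages.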
And the Solution?
So the solution recipe goes as follows:
- 2 Java Applications
- 1 open source MQTT library (Paho from Maven Central)
- 1 open source JSON library (Jackson from Maven Central)
- 1 Amazon IoT Thing
- A pinch of multi-threading (java.util.concurrent)
- 1 Certificate
- Login to AWS and direct yourself towards the Amazon IoT service.
- Create a new “Thing” (CloudFormation will do here as well)
- Download the certificates for connecting to the “Thing’s” endpoint; you will need these to configure the Paho client.
- Create a Java project with 2 separate threads.
- Create two different POJOs to identify the different types of messages (data and command).
- Make sure that command messages carry the action to perform, and that data messages are sent back as the response.
- Use the java.util.concurrent CountDownLatch to synchronise between publish and subscribe threads.
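The last two steps above can be sketched roughly as follows, assuming two illustrative POJOs (CommandMessage and DataMessage) and a CountDownLatch shared between the publish and subscribe threads. The actual MQTT plumbing (the Paho client, topic names, and JSON serialisation with Jackson) is elided and hinted at only in comments; all names here are illustrative.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

// Sketch: synchronise a publishing thread with a subscribing thread
// using a CountDownLatch, so a published command blocks until the
// corresponding data response arrives.
class RpcLatchSketch {

    // Command message POJO: what the remote side should do.
    static final class CommandMessage {
        final String action;
        CommandMessage(String action) { this.action = action; }
    }

    // Data message POJO: the response sent back.
    static final class DataMessage {
        final String payload;
        DataMessage(String payload) { this.payload = payload; }
    }

    private final CountDownLatch latch = new CountDownLatch(1);
    private volatile DataMessage response;

    // Called by the subscribe thread when a data message arrives
    // (in the real setup, from inside the Paho message callback).
    void onDataMessage(DataMessage msg) {
        response = msg;
        latch.countDown();
    }

    // Called by the publish thread: send the command, then block
    // until the subscribe thread delivers the response.
    DataMessage call(CommandMessage cmd) {
        // real setup: publish the JSON-serialised command here via the Paho client
        try {
            if (!latch.await(10, TimeUnit.SECONDS)) {
                throw new IllegalStateException("timed out waiting for response to: " + cmd.action);
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new IllegalStateException("interrupted", e);
        }
        return response;
    }

    public static void main(String[] args) {
        RpcLatchSketch rpc = new RpcLatchSketch();
        // Simulate the subscribe thread delivering a response after the call.
        new Thread(() -> rpc.onDataMessage(new DataMessage("result"))).start();
        System.out.println(rpc.call(new CommandMessage("run-report")).payload);
    }
}
```

A fresh latch (or one latch per in-flight command, keyed by a correlation id) would be needed for each RPC call; a single shared latch, as shown, only covers one round trip.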
Our implementation will publish command and data messages, and will also listen for command and data messages. The data will be processed or stored, and the commands will be executed. There you have it: RPC on Amazon IoT.
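That routing logic might look roughly like this, assuming illustrative topic names under an `rpc/` prefix; in the real implementation it would sit inside the Paho message callback, and the payloads would be deserialised into the POJOs with Jackson.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Sketch: route inbound messages by topic. Data messages are stored,
// command messages are handed to an executor. Topic names are
// illustrative assumptions, not an AWS IoT convention.
class MessageRouter {

    private final List<String> store = new ArrayList<>(); // "processed or stored"
    private final Consumer<String> executor;              // "executed"

    MessageRouter(Consumer<String> executor) {
        this.executor = executor;
    }

    // Route one inbound message by its topic.
    void onMessage(String topic, String payload) {
        if (topic.startsWith("rpc/data/")) {
            store.add(payload);
        } else if (topic.startsWith("rpc/command/")) {
            executor.accept(payload);
        }
        // anything else is ignored
    }

    List<String> stored() {
        return store;
    }
}
```

Keeping the routing in one place like this means the Paho callback stays trivial: it just passes every arriving topic and payload straight to the router.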
All of this sounds pretty straightforward right?
Does it work in production?
Actually, once I was done playing games, this solution made it into a customer implementation. It is currently used for a near real-time integration between Sage 200 and Salesforce.com through our Integration Platform as a Service, “Cloudreach Connect”.
If you like what you’ve read and would like to try it out for yourself, there is no better way to take this tech for a spin than by using our Cloudreach Connect iPaaS. You can ask for a demo account, or get started with Connect through the Amazon Marketplace.
For more posts by Chris Stura, please click here