Category Archives: Cloud Computing

DeepLens Challenge Hackathon Project Submitted

Nautilus Face Tracker

My DeepLens Challenge Project is called ‘Nautilus Face Tracker’. The goal of my project is to integrate the DeepLens Camera with Alexa, using AWS Cloud as the backbone and brains of the system, to allow Alexa to recognize faces as well as learn to recognize new faces she has never seen before. My project was really more of an AWS Service Integration problem than a Machine Learning exercise, but I did learn a lot about the DeepLens Platform and definitely have my interest piqued in learning more about Machine Learning and Artificial Intelligence. In fact, my girlfriend also has a new interest in the subject, presumably just from listening to me work on this project. She even started taking a ChatBot course last week on Coursera as a result. These are fascinating times to work in the Information Technology field (and I shan’t be outdone)!

Here’s the architectural diagram I made to describe and explain, at a high level, how I put my project together. The source code is still in a private repository while the competition takes place. The Hackathon ends on Valentine’s Day (2/14/2018).

Nautilus Face Tracker Architecture

I put this whole thing together using some fairly simple Python and nodejs scripts and a few Lambda Services.  In retrospect, the AWS services and programming APIs are very easy and powerful to use.

Here’s the video I made demonstrating my project.  A video demonstration of working code was a requirement for this project.

So What’s Next?

I definitely want to learn more about Machine Learning. I found some good learning resources as starting points, and Coursera has some good Deep Learning classes as well:

Real-Time Image Rekognition with Tensorflow? Not Exactly…

I’ve got one week left to complete my AWS DeepLens Hackathon Project. I wrote about my decision to participate in the AWS DeepLens Hackathon here. It’s been an enjoyable learning experience so far. I’ve seen a few projects doing some amazing Machine Learning things with real-time image recognition using things like Tensorflow. I can barely spell Tensorflow. Tonight, I was pretty stoked just to get my Echo Spot to recognize my face by using the vision and face recognition capabilities of the DeepLens.

I just won this Echo Spot at my company Christmas Party a few weeks ago…

Alexa Spot Face Recognition

Now to apply some finishing touches and to submit my project.  Tensorflow or not…yolo.

Flow Zone

It’s true…

DeepLens Challenge Day 10: Face Matching with AWS Rekognition

Wrestling Machines

It was the best of times, it was the worst of times. Day 10 (for me) of the DeepLens Challenge (I first blogged about this here). I have made some progress and am now able to match face images, retrieved from the DeepLens Camera, against a face image gallery I built using AWS S3, Lambda, DynamoDB and the Rekognition Service (I used this blog post to get things set up). Using the Rekognition Service was actually pretty straightforward and easy, especially as there is a clear blog post outlining how to get started. Unfortunately, working with the DeepLens Camera is not so easy at times.

  • Downloading Projects from the AWS Console to the DeepLens sometimes gets hung up. I found that running
    sudo systemctl restart greengrassd.service

    on the Camera usually kicks it into gear and allows the Project to download. But the build/deploy process is time-consuming and fraught with missteps.

  • Your Project version can only go up to 9 for some reason, so I was deleting my Project when the Version hit 9. However, I ran into a bug last night where the DeepLens Camera would get Deregistered whenever you deleted an associated Project. So that meant resetting the device to put the on-board Wifi in the right state so that the device could be Registered with Amazon again. Arrrggh! And no deleting Projects until this is over!
  • My DeepLens was automatically updating itself, putting my Camera in a bad state, as the AWS Camera software was apparently incompatible with the Linux updates I was receiving. I finally figured out how to turn off the automatic updates (done when Registering the DeepLens with AWS), and followed steps to lock in Linux kernel 4.10.17+.
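On the brighter side, the Rekognition face matching itself really is simple: it boils down to a single SearchFacesByImage call against the face collection. Here's a minimal sketch of how the request gets built; the collection name, threshold and other values are my illustrative stand-ins, not the project's actual configuration:

```python
def build_face_search_request(image_bytes, collection_id="nautilus-faces",
                              threshold=80.0, max_faces=1):
    # Parameters for Rekognition's SearchFacesByImage API; with boto3 this
    # would be passed as rekognition.search_faces_by_image(**params).
    return {
        "CollectionId": collection_id,    # the face gallery to search
        "Image": {"Bytes": image_bytes},  # the cropped face jpg
        "FaceMatchThreshold": threshold,  # minimum similarity (0-100)
        "MaxFaces": max_faces,            # number of best matches to return
    }

params = build_face_search_request(b"\xff\xd8fake-jpg-bytes")
```

The response then contains FaceMatches with similarity scores, which is what lets the system decide whether the camera is looking at a known face.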

Flow Zone

This is a cool little song from the immensely talented Martin Garrix. I first heard this song at AWS re:Invent in 2016. The depth of the bass and sharpness of the sounds blew me away, not to mention the psychedelic jelly-fish visuals.

All Your Face Are Belong To Us: DeepLens Challenge Day 5

Keep Hope Alive

This is Day 5 (for me) of the DeepLens Challenge, which I talked about starting in my post here.  I have to submit my project by February 12th or 13th.  I’m making progress toward my project goal, which right now is simply to recognize a face in an image cache from a live video feed using the stock face detection model on the DeepLens device.  Face and image recognition is pretty commonplace today, I guess, but I’m stoked to get something similar working myself.  I’d also love to integrate Alexa into the mix somehow as well, but I need to start making bigger strides with less messing about with the fiddly things!

Coding Challenges And Solutions

Some of the challenges I’ve faced, and (mostly) overcome, so far include:

  • Cropping a detected face out of the DeepLens video feed in the Lambda Python script.  Turns out this is very simple, but it took me a while to figure out.
  • How to convert the cropped face image to a jpg and write it to disk.  Also very simple in retrospect, but I’m a moron.
  • I thought it would be easy to write the resulting face jpg to AWS S3 from the DeepLens edge device, but this one I just could not figure out due to permission issues.  I can write to S3 using the aws cli as the aws_cam user, but so far I’ve not been able to extend those same permissions to the ggc_user account, which seems to be what runs the awscam software.  I even hard-coded credentials in the creation of my S3 client in the lambda code, but still had permission problems.  I had to back off from hacking on the device out of fear of really screwing something up, however.  Best to stay off the DeepLens as much as possible in retrospect.
  • The only way I was able to get a face image off the DeepLens and into the cloud so far is by converting it to a base64 String, putting into a JSON object, and putting it on the IoT Topic.  I worry that all this data transfer is going to cost me an arm-and-a-leg by the end of this thing…
  • When creating a lambda function to read from the IoT Topic, I kept getting a random error when trying to save it, which made no sense as I was following an AWS Blog Post for how to do the same.  Then I found this:  And this is what makes hackathons using new technology so fun!  Writing software is really just lots of Google Searches.
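The crop-and-publish flow described above can be sketched roughly like this. In the real Lambda the frame is a numpy array from the awscam feed and the jpg encoding is done with OpenCV; here a nested list and fake jpg bytes stand in, and the JSON field names are my own invention:

```python
import base64
import json

def crop_face(frame, box):
    # Crop a detected face region out of a frame. box is
    # (xmin, ymin, xmax, ymax) from the face detection model's output;
    # slicing works the same way on a numpy image array.
    xmin, ymin, xmax, ymax = box
    return [row[xmin:xmax] for row in frame[ymin:ymax]]

def build_iot_payload(jpg_bytes, device_id="deeplens-1"):
    # Base64-encode the face jpg and wrap it in the JSON message that
    # gets published to the IoT topic.
    return json.dumps({
        "device": device_id,
        "face_jpg_b64": base64.b64encode(jpg_bytes).decode("ascii"),
    })

frame = [list(range(20)) for _ in range(20)]
face = crop_face(frame, (2, 3, 6, 8))   # 4 pixels wide, 5 tall
payload = build_iot_payload(b"\xff\xd8fake-jpg")
```

A cloud-side Lambda subscribed to the topic then just reverses the base64 step before writing the jpg to S3.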

And speaking of the Internet of Things (IoT), to date I’ve thought this was just another marketing buzzword that wasn’t going to pan out, so to speak.  I used to think the same about ‘cloud’ (and still think this about Bitcoin and its ilk).  But this DeepLens development challenge is giving me a greater appreciation for IoT and edge computing.  In fact, we’ve been talking about the proliferation of internet-connected things and the resulting possibilities since Java Jini, and probably before that, but I suspect Python will be its great enabler instead of Java at this point.  But I digress…

Baby Steps, But Machine Learning Learning Nowhere In Sight

So as of today, I am able to leverage the stock face detection model to detect and crop a face out of a live video feed from DeepLens, send it up to the AWS Cloud Lambda IoT Topic Listener, and put it into an S3 Bucket.  Next step is to try to figure out how to use the AWS Rekognition service to recognize face images in an image cache.

The Flow Zone

I’ve found listening to music particularly distracting these last few days.  However, I find this Horn Solo in Tchaikovsky’s 5th Symphony really soothing and not distracting (but too short).  I played this solo in Solo and Ensemble in High School.  I’ve been told that french horn players are better kissers…

AWS DeepLens Hackathon: A Machine Learning Newbie’s Journey Into the Abyss Part 2

Feel the Sweet Pain

So far, I’ve learned some painful, hard-fought, lessons in the last two days. I was initially able to register my DeepLens device with AWS Cloud, no problem. The first hiccup I encountered was when I tried to push one of the pre-made models down to the device. They simply would not go, and there are no logs to look at, as that might be too helpful. So, thinking like a DeepLens device myself, I reasoned I probably screwed-up the IAM roles when I tried to register the device (later I learned my assumption was spot-on). To correct the model push problem I was having, I Deregistered the device hoping I could simply go through the Registration process again making sure my IAM Roles were properly configured. And wouldn’t you know the dang wifi on the device stopped working preventing me from logging in to the device to re-register it with the cloud.

The way the DeepLens currently works is that you can only configure it (and upload the certificates it needs to identify itself with your AWS Account) by using its on-board wifi and pointing your web browser (on another computer) at its setup page. I still can’t get over how odd this is – not sure what Amazon was thinking with this 🙂 .  I think it’s odd because my first inclination is to treat the DeepLens like a first-class computer, meaning I have my keyboard, mouse and monitor connected to it. Why would I need to configure it from another computer over wifi? OMG so funny!!

Whither Went My DeepLens Wifi

So the wifi simply would not come on again, as life’s ironies often dictate. So my girlfriend and I went out to Best Buy in 20 degree weather (I bet your girlfriend wouldn’t do that) to buy a USB Hub and a USB-to-Ethernet connector, the idea being that if I could get the device online over ethernet, maybe I could configure this thing that way. Using a hard-wired ethernet connection, my DeepLens was back online, but now with a new IP Address. The instructions say to connect to your device console using the wifi setup address. Being the contrarian that I am, I tried to connect using the ethernet address – yeah, no dice.  In fact, I could not even find anything running on port 80 of the device at this point. What had I done?!?

Save me USB, you’re my only hope!

AWS guys, I’d totally put an ethernet port in the back of this device.

After poking around a bit, I found the awscam software in /opt/awscam. It looks to me like the DeepLens console is just a nodejs app that is served by some python scripts in the daemon-scripts directory. And wouldn’t you know, those scripts are hard-coded to bind the nodejs app to the wifi device and run on a hard-coded port. I’m dying here. Ok, so I either have to figure out how to modify the daemon python scripts to use the eth0 device and bind to its address, or I have to get the on-board wifi working again.

Luckily, I saw a mention on the AWS forum about a possible Linux Kernel incompatibility with the DeepLens wifi hardware, so I decided to try the path of getting the wifi hardware working again by reverting to an older Linux Kernel, if one even existed – I didn’t know at this point. The following video got me over the hardest piece of solving how to boot an older Ubuntu Kernel:

The GRUB Loader does not display upon reboot on the DeepLens by default, so my first step was to get the GRUB Menu to show:

  1. edit /etc/default/grub
  2. comment out GRUB_HIDDEN_TIMEOUT
  3. set GRUB_TIMEOUT=-1
  4. sudo update-grub (or do it as root)
  5. reboot
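For reference, after steps 2 and 3 the relevant lines of /etc/default/grub looked roughly like this (a sketch from memory, not a verbatim copy of my file):

```
GRUB_DEFAULT=0
#GRUB_HIDDEN_TIMEOUT=0     <- commented out so the menu is not hidden (step 2)
GRUB_TIMEOUT=-1            <- wait at the GRUB menu indefinitely (step 3)
```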

Once the device reboots you will finally see the GRUB menu – fantastic!  Select advanced settings, then select the 4.10.17+ kernel.  Once rebooted, the on-board wifi should be working again and the little blinky middle light should be happy again.  Now you should be back on track to register your device per the AWS instructions.  And if you ever need the happy blinking middle wifi light again, the setup pin hole in the back of the camera should work as long as you are running the correct Linux Kernel.

I’m not positive the kernel is the problem, but I am positive these steps worked for me.  And how did I get kernel 4.13.0-26-generic installed in the first place?  I’m not even sure.  I did try to update my device, and maybe that was the start of the problem?  I’m not sure.

Anyway, I am now able to download the pre-built Face-Detection Project to my device, as seen here:

DeepLens face detection

At this rate, it’s doubtful I’ll get anything built by the hackathon deadline, but it’s kind of fun messing with the hardware.

This Armin Van Buuren Ibiza set is so tight.  Love it, especially around minute 40!

Barbarians Inside the Gates: AWS Security Roadshow

AWS Security Roadshow, Tysons Corner, VA (5/23/2017)

I attended the AWS Security Roadshow yesterday in Tysons Corner, VA (5/23/2017).  Members of the AWS Technical Services Team delivered various briefings and answered one-on-one questions regarding best practices for securing one’s AWS Cloud-Based Software Solutions.  One of my biggest takeaways was the idea of ‘DevSecOps’.

The software development life cycle (SDLC) is typically a process balanced by two competing forces: Development and Operational Staff.  The Development Staff is typically motivated by the imperative to deliver quality code quickly and often, while the Operations Staff is typically motivated by the imperative to keep the Production Environment running and stable, with as few changes as possible.  AWS are encouraging users of their platform to include a third competing component in the typical SDLC: Security Staff.

Security Staff, the ‘Sec’ in the term ‘DevSecOps’, are motivated by the imperative to keep the bad guys away from Enterprise Data, promising to make the balancing act between Development and Operational imperatives even more contentious, albeit a necessary contention at that.  Security Engineers need to be integral components of any Enterprise Software Engineer Team, and they need to be driving Security concerns and architectural decisions from the very beginning of the SDLC.  Computer Security is not a quality gate, but an integral part of the SDLC.

Security Inconsistencies

While overall I am impressed by AWS’ focus on Cloud Security, and their desire to ensure that AWS customers practice ‘Safe OpSec’ (Safe Operational Security, for you AFN Fans) on their platform, I have noticed a few inconsistencies in the overall security messaging:

Practicing Safe OpSec Costs More

Keeping technical assets secure in the AWS Cloud costs more.  For example, if you want to keep your Lambda function safe from the wily internet behind a Virtual Private Cloud (VPC), the VPC is going to cost you.  Moreover, if your Lambda function, running safely on your VPC subnet, needs to access the public network for anything, like to access SES to send out an email notification, your VPC will need to be attached to a NAT to forward internet-bound requests out through an Internet Gateway.  The NAT/Gateway implementation is also going to cost you.  So, in reality (and this may matter quite a bit to bootstrapped startups using AWS), it will cost a customer significantly more to secure their cloud-based solution than not.

Even Erlich Bachman and his ‘See Food’ startup express angst over AWS charges…

Penetration Testing Can Get You In Trouble

The AWS Staff encouraged participants at this particular Road Show gathering to automate security testing, and penetration testing in particular, into the CI/CD code build and deployment pipeline.  However, penetration testing in someone else’s cloud infrastructure can land you in hot water.  You need to be sure to read the law of the land on this issue, and request permission to pen-test from AWS.  From a newbie customer’s perspective, these instructions seem a bit ominous and could deter folks from even bothering.

Alexa Skill Security

I asked one AWS Engineer some questions about Alexa Security and how Alexa might be securely utilized in the Enterprise.  The engineer I asked was not an Alexa engineer, so he agreed to forward my question to the Alexa Engineering Staff.  I have not heard anything back yet on my questions, but I suspect IT security and Alexa Skills have yet to meet one another.

Think Like A Barbarian

I am impressed that AWS is concerned enough about sharing security concerns with their customers that they are traveling around the United States to help ensure that IT security remains a primary concern.  AWS have a vested interest in customers who are well educated on AWS Cloud services and security best practices.  Their message is clear: when deploying applications to the AWS infrastructure, think like a Black Hat and use AWS services and best practices to help protect your assets.  As more and more organizations move to AWS, IT Security becomes increasingly important for the growing universe of AWS Cloud Customers.

EDM Love

When I was in 6th grade, I discovered one of my Dad’s records (what the heck is a ‘record’?) called ‘Switched-On Bach’. I gave it a listen and was ABSOLUTELY mesmerized. I had never heard anything like it! The album was full of J.S. Bach music played on a Moog Synthesizer by Wendy Carlos. According to Wikipedia, this album placed in the top 10 on the US Billboard 200 between 1969 and 1972! The combination of old school Classical Music being played on a high tech instrument like the Moog Synthesizer was a blissful combination to me.

Years later I fell in love with the movie, ‘Tron’. The Tron soundtrack was authored and played by my idol from the ‘Switched-On Bach’ days, Wendy Carlos. It was about this time (I was a freshman in High School) that I started to get what an awesome combination computers and music made.

Since then, I’ve tinkered with digital music. I’ve written a few goofy songs. I’ve tried my hand at producing a few songs on Garage Band. I’ve dreamed of becoming a DJ, even talked to a few DJs about how to go about it. But I’ve never pursued this passion much further than that. I’ve been called a ‘fart in a frying pan’ because I chase a lot of different dreams. I do need a bit of focus in my life…Anyway, two years ago I took my girlfriend to U Street in DC to try to get a feel for the DJ scene in Washington DC. The main DJ was Afrika Bambaataa, one of my faves in High School. We had an awesome time, but the vibe wasn’t really what I was looking for. U Street is no Ushuaia.

One thing on my bucket list is to party in Ibiza

Fast forward to 2016. Never mind how old I am now. Not important. This past November, I had the privilege of attending a Martin Garrix performance in Las Vegas with my girlfriend. We were blown away by the richness of the bass, the visual sensations, and the overall experience. The Martin Garrix performance was at the AWS re:Invent 2016 Cloud Conference at the re:Play party. I felt I had been reconnected with my childhood fascination with computers and music. It was SENSATIONAL!!!

AWS re:Play Party 2016

Fast forward to today. Where is the EDM scene? Where are the best DJs in the world? Where can we go to hear them and experience the vibe? Turns out one of the best EDM DJ experiences in the world is in Boom, Belgium at Tomorrowland. So guess where we are going this July?!?!?! Headliners currently include most of my favorites, including Martin Garrix, Armin van Buuren, Afrojack, Kaskade, Steve Aoki, etc. I’m hoping that David Guetta, Dimitri Vegas and Like Mike are also there for the Weekend 2 performances – BE THERE OR BE SQUARE!! Tickets have already sold out. We are so stoked!!!!

If you are going to Boom this summer, drop me a line.

Building A Startup On AWS

Let’s Dance

Building on the knowledge from my previous two blog posts following the Wild Rydes AWS Serverless Computing Tutorials ( Wild Rydes Part I and Part Deux), I decided to put some of that information to use in my own work at

I’ve been working on some mobile apps and a back-end platform supporting my trans-Atlantic Ocean Rowing attempt last year with my girlfriend, Cindy. I’d like to turn some of the things I’ve developed thus far into a Software as a Service (SaaS) for other people to easily use on similar adventures. To that end, I wanted to quickly create a responsive website to put out some information about my future offerings, including the ability to allow interested parties to contact me by providing their email address and a contact message in a simple contact form.

Know Your Limitations. Build On the Shoulders of Giants

I know I do not have great web design skills. Web Design is just not my focus. But I needed to create a nice looking website for my startup landing page. What to do? I did some quick searches and found lots of free Bootstrap templates I could use for my purposes. Over the course of an afternoon I grabbed a free Bootstrap Template that I liked, cut-in some of my own images, and modified the html to create the menus and sections I wanted in my landing page. I brought in some of the JavaScript from the Wild Rydes tutorial I was working through to connect my Contact Form to my DynamoDB database running in my AWS Account. After I had a look-and-feel I was going for, and the functionality was working ok for the Contact Form, it was simply a matter of uploading my web site assets to my S3 bucket:

> aws s3 sync . s3://

Stop Daddy

I had previously registered my Domain Name ( with GoDaddy last year. Now I wanted to move the DNS Registrar to AWS. This turned out to be very easy. Once I followed the documented steps to move a domain to AWS, I only had to add an A Record to point the domain to my S3 Bucket containing the website artifacts. I will point this A Record to a CloudFront endpoint soon.
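For anyone curious, that A Record for an S3-hosted site is an ‘Alias’ record pointing at the S3 website endpoint. A hedged sketch of the Route 53 change batch is below; the domain, endpoint and hosted zone ID are placeholders for a us-east-1 style setup, not my actual values:

```json
{
  "Changes": [{
    "Action": "UPSERT",
    "ResourceRecordSet": {
      "Name": "example.com.",
      "Type": "A",
      "AliasTarget": {
        "HostedZoneId": "Z3AQBSTGFYJSTF",
        "DNSName": "s3-website-us-east-1.amazonaws.com.",
        "EvaluateTargetHealth": false
      }
    }
  }]
}
```

This kind of JSON gets handed to aws route53 change-resource-record-sets via the --change-batch flag (or entered through the console's Alias dropdown, which is what I actually used).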

Lipstick On A Pig

Now that the landing page is up, there is a mountain of work to do. The next step is to get email working for my domain using AWS SES so I can use that domain email to register as an organization in the Apple iOS Developer Program.

AWS Serverless Computing Example: Wild Rydes Part Deux

WildRydes Admin Interface

In my last blog post, I mentioned how I was working my way through @jpignata‘s excellent tutorial on GitHub on how to work with AWS Lambda Services, API Gateway, etc.  I work through the tutorial when I have a few minutes to spare and am finding it quite enjoyable.  AWS Lambda Services and the API Gateway are pretty fun and interesting to work with.

In Lab 3 of the tutorial we create an Admin Interface to allow authenticated users the ability to view the email addresses that have been added to the DynamoDB database.  Admin Users are authenticated by the Admin Interface against a Cognito User Pool.  This lab was pretty straight-forward as I did not fat-finger any mistakes this time.  However, I did take note of the following:

API Gateway URL Notes:

  • Make sure to ‘Deploy API’ after you make changes to your API Gateway.  Many times I thought my configuration updates simply weren’t correct, when in fact I had simply forgotten to deploy the updates to ‘prod’.
  • Simply adding ‘Authorizers’ on your API Gateway is not sufficient for protecting the URL endpoint.  You have to also add the Authorizer to the Method Request ‘Authorization’ of the URL endpoint.  I found that my API endpoints were not protected until I remembered to do this step:
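A quick way to check that the Authorizer is actually enforced is to hit the endpoint with and without the Cognito ID token and compare status codes. A minimal Python sketch, assuming a placeholder endpoint URL and token (the real ones are account-specific):

```python
import urllib.request

def build_admin_request(endpoint, id_token=None):
    # API Gateway's Cognito Authorizer reads the ID token from the
    # Authorization header by default; without it, a protected endpoint
    # should return 401 once the Authorizer is attached to the
    # Method Request 'Authorization' setting.
    req = urllib.request.Request(endpoint)
    if id_token:
        req.add_header("Authorization", id_token)
    return req

req = build_admin_request(
    "https://abc123.execute-api.us-east-1.amazonaws.com/prod/admin",
    id_token="eyJraWQi...placeholder")
# urllib.request.urlopen(req) would perform the actual call (not done here).
```

If both requests come back 200, the Authorizer is configured but not attached to the method, which is exactly the trap described above.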

Lambda CI/CD Pipelines

As I am slowly working up to doing something more substantial with Lambda Services, I am curious how one might integrate Lambda Serverless Code into a CI/CD Pipeline.  It seems you can use Gulp/Grunt with the gulp-awslambda plugin to accomplish this.  I need to try this out.

My Admin screen is available on CloudFront, but you can’t log in.

My API Gateway Endpoint is publicly available as well, but it should be protected against unauthenticated users:

What a great tutorial!!  One more Lab to finish and I’ll hopefully be off building something real…

Some Randomness : ‘The Black Bear’

My girlfriend and I have been watching ‘Black Mirror’ on Netflix occasionally.  Last night, we watched the ‘White Bear’ episode, which freaked me out, as most episodes do.  But, it also made me think of one of my favorite Bagpipe Tunes, ‘The Black Bear’.  Hear some renditions to make your cubicle-bound blood start pumping:

Paaaaaaaassssssss. In. Revieeeeeeeeeewwwwwwwwwwwwwwwwww!!!!!

AWS Serverless Computing Example: Wild Rydes Part I

I’ve been working through a tutorial I started in a session I took at AWS re:Invent 2016.  I did not finish the tutorial in class so I started working on it again after getting home from the conference.  The tutorial is on GitHub if you care to follow along.

Admittedly, I’m not the sharpest knife in the drawer (but I am made of the hardest, most persistent steel…they call me, ‘Blue Steel’ – said in my best Ben Stiller voice).  It took me a while to figure out why I could not get the AWS Javascript SDK to allow unauthenticated users, vis-a-vis AWS Cognito, to access my DynamoDB Email Table.  Here are some errors and things I learned troubleshooting this:

The latest Firefox browser seems to give better clues in the Developer Console about why things are not working than Google Chrome.  Using Google Chrome, I kept seeing an error like, “Missing Credentials In Config”, and was really confused about what exactly that meant.  I was following the tutorial exactly, as far as I could tell, so I could not discern whether this error was from a code change I made or an AWS configuration problem.  Then I looked at my website in Firefox, using the Firefox Developer Console, and could see a little bit better what was going on.

Here’s my main error as seen in the Google Chrome Developer Console:

And here’s the same error as reported by Firefox Developer Console:

Ahh!  So a ‘ResourceNotFoundException’ is being thrown.  Now I could see that my Javascript code probably wasn’t the problem and that my Cognito/IAM Role Configuration might be the culprit.

After further investigation…a day (or so) later…I discovered a simple typo in my DynamoDB Table Name:

The table name should have been ‘Wildrydes_Emails’.  Seriously?!?!  Yes, I’m an idiot (but one made of ‘Blue Steel’…).  Once that was corrected, I was finally able to get my unauthenticated Cognito Role to access my DynamoDB Table.
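For what it’s worth, the put-item parameters that end up hitting DynamoDB look something like this (shown here as a boto3-style Python sketch; the ‘email’ attribute name is my guess at the tutorial’s schema, so treat it as illustrative):

```python
def build_put_email_params(email, table_name="Wildrydes_Emails"):
    # Parameters for a low-level DynamoDB PutItem call; with boto3 this
    # would be dynamodb.put_item(**params). A misspelled table_name here
    # is exactly what produces a ResourceNotFoundException.
    return {
        "TableName": table_name,
        "Item": {"email": {"S": email}},
    }

params = build_put_email_params("someone@example.com")
```

In other words, one wrong character in TableName and DynamoDB has no idea what you are talking about, which is a humbling way to lose a day.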

There is still work to be done in this tutorial, and I’ll blog about any issues I overcome as I encounter them.  My work is being hosted in my AWS account on Cloudfront, so feel free to check it out and submit your email to my DynamoDB database.  Let’s get this startup rolling!