SQS (S3 Event) Lambda Trigger

I ran into a little issue today parsing an S3 event that was delivered to Lambda via an SQS trigger.  I assumed the incoming event to Lambda was entirely of type dict, so I figured I could pull the bucket name and key using this syntax.

bucketname = event['Records'][0]['body']['Records'][0]['s3']['bucket']['name']
objectname = event['Records'][0]['body']['Records'][0]['s3']['object']['key']

As it turns out, the incoming event is not entirely of type dict, and I got the following error.

string indices must be integers

The value after the body (['Records'][0]['body']) is of type str — it is a JSON string that must be parsed with json.loads() first.  Below is the updated code to pull the bucket name and key from the incoming event.

import json

event_body_records_string = event['Records'][0]['body']
event_body_records_dict = json.loads(event_body_records_string)

bucketname = event_body_records_dict['Records'][0]['s3']['bucket']['name']
objectname = event_body_records_dict['Records'][0]['s3']['object']['key']

Now everything works out great!!!
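Putting it all together, here is a minimal handler sketch for this pattern.  The function name and return value are my own illustration (a real handler would do something with each object rather than return it), but the parsing steps are exactly the ones above.

```python
import json

def lambda_handler(event, context):
    """Sketch: handle S3 notifications delivered to Lambda via an SQS trigger.

    Each SQS record's 'body' is a JSON *string* containing the original
    S3 event, so it must be decoded with json.loads() before indexing.
    """
    objects = []
    for sqs_record in event['Records']:
        s3_event = json.loads(sqs_record['body'])  # body is a str, not a dict
        for s3_record in s3_event['Records']:
            objects.append((s3_record['s3']['bucket']['name'],
                            s3_record['s3']['object']['key']))
    return objects
```

Looping over both Records lists also handles the case where SQS batches multiple messages into one Lambda invocation.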

AWS SQS – receive_message

When using the receive_message Boto3 call to pull messages from an SQS queue, you will always get a response back when the call completes.  However, how do you determine whether the response actually contains a valid message?

Quick trick:

response = sqs.receive_message(QueueUrl=queue_url)
if 'Messages' in response:
    print("Message on the queue to process")
else:
    print("No messages on the queue to process")

That's about it!!
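The same check works nicely wrapped in a small helper.  This is just a sketch: the client setup, queue URL, and polling parameters below are illustrative assumptions, not from a real queue.

```python
# Illustrative setup (requires boto3):
#   import boto3
#   sqs = boto3.client('sqs')
#   queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"

def poll_queue(sqs_client, queue_url):
    """Poll the queue once and return the list of messages (empty if none)."""
    response = sqs_client.receive_message(
        QueueUrl=queue_url,
        MaxNumberOfMessages=10,  # up to 10 messages per call
        WaitTimeSeconds=10,      # long polling cuts down on empty responses
    )
    # 'Messages' is simply absent from the response when the queue is empty
    return response.get('Messages', [])
```

Using .get('Messages', []) lets callers iterate over the result without first testing for the key themselves.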

ANT – JUNIT – PARALLEL

Looking to speed up your builds by running your JUnit tests in parallel with Ant?  As of Ant 1.9.4, the junit task supports the "threads" attribute.  It defaults to "1", but can be set to the number of threads you want used for parallel test execution.

Note that when using this new attribute, you must set "forkmode" to "perTest", and if you are upgrading to Ant 1.10.x, a Java 8 runtime is required.
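A minimal build-file sketch might look like the following; the classpath refid, report directory, and test fileset pattern are placeholders you would adapt to your own project.

```xml
<!-- build.xml fragment: run JUnit tests on 4 threads (Ant 1.9.4+) -->
<junit fork="true" forkmode="perTest" threads="4">
  <classpath refid="test.classpath"/>
  <formatter type="xml"/>
  <batchtest todir="${reports.dir}">
    <fileset dir="${test.classes.dir}" includes="**/*Test.class"/>
  </batchtest>
</junit>
```

Since "threads" forks one JVM per test, it mainly pays off when individual tests are slow rather than when JVM startup dominates.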

AWS Certified SysOps Administrator – Associate

It's time to study for AWS certification #3.  I took a little time off, but no more!  This time I am going for the SysOps Administrator – Associate certificate.  I gather it's a little harder than the other two certifications I passed, but still within range!

What's hard about some of these certifications is that you don't actually work with all of the covered material day-to-day.  Spinning up EC2 instances and creating security groups is pretty standard, but tasks like networking are not typically done by me.

So my plan is going to be the following to pass the exam:

  • Listen to a training course from acloud.guru on the way to and from work. (Long commute both ways)
  • Practice heavily in AWS on all the relevant topics.
  • Use the practice exam voucher I received from passing previous certifications to pay for an official AWS SysOps practice exam.
  • Rely on personal experience.
  • Cross-fingers!!!
