AWS

SQS (S3 Event) Lambda Trigger

I ran into a little issue today parsing an S3 SQS event that was sent to Lambda via an SQS trigger.  I assumed the incoming event to Lambda was 100% of type dict, so I figured I could pull the bucket name and key using this syntax.

bucketname = event['Records'][0]['body']['Records'][0]['s3']['bucket']['name']
objectname = event['Records'][0]['body']['Records'][0]['s3']['object']['key']

As it turns out, the incoming event is not 100% of type dict, and I got the following error.

string indices must be integers

The value at event['Records'][0]['body'] is actually of type str (a JSON string), not a dict, so it has to be parsed with json.loads before the inner Records can be indexed.  Below is the updated code to pull the bucket name and key from the incoming event.

import json

# The body of each SQS record is a JSON string, not a dict
event_body_records_string = event['Records'][0]['body']
event_body_records_dict = json.loads(event_body_records_string)

bucketname = event_body_records_dict['Records'][0]['s3']['bucket']['name']
objectname = event_body_records_dict['Records'][0]['s3']['object']['key']

Now everything works out great!!!
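
Putting it all together, here is a minimal sketch of what the full handler could look like.  It assumes the only work needed is printing the bucket and key, and it loops over every incoming record rather than just the first:

import json

def lambda_handler(event, context):
    # Each SQS record carries the S3 event as a JSON string in its body
    for sqs_record in event['Records']:
        s3_event = json.loads(sqs_record['body'])
        # S3 test events ("s3:TestEvent") have no 'Records' key, so skip those
        for s3_record in s3_event.get('Records', []):
            bucketname = s3_record['s3']['bucket']['name']
            objectname = s3_record['s3']['object']['key']
            print(f"New object: s3://{bucketname}/{objectname}")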


AWS SQS – receive_message

When using the “receive_message” function from the Python Boto library to pull message(s) from an SQS queue, you will always get a response back when the call completes.  However, how do you determine whether the response you got back actually contains a message?

Quick trick:

import boto3

sqs = boto3.client('sqs')
queue_url = 'https://sqs.us-east-1.amazonaws.com/123456789012/my-queue'  # example URL

response = sqs.receive_message(QueueUrl=queue_url)
if 'Messages' in response:
    print("Message on the queue to process")
else:
    print("No messages on the queue to process")

That's about it!!


AWS Certified SysOps Administrator – Associate

It's time to study for AWS Certification #3.  I took a little time off, but no more!  This time I am going for the SysOps Administrator – Associate certificate.  I guess it's a little harder than the other 2 certifications I passed, but still within range!

What's hard about some of these certifications is that you don't actually work with all of the covered material day-to-day.  Spinning up EC2 instances and creating security groups is pretty standard, but things like networking are not tasks I typically handle.

So my plan is going to be the following to pass the exam:

  • Listen to a training course from acloud.guru on the way to and from work. (Long commute both ways)
  • Practice heavily in AWS on all the relevant topics.
  • Use the practice exam voucher I received from passing previous certifications to pay for an official AWS SysOps practice exam.
  • Rely on personal experience.
  • Cross-fingers!!!


AWS SAA Certification Prep – 2018

I’m getting ready to take the AWS Solutions Architect Associate 2018 test.  Below are some final items I need to review before the exam.

AWS FAQs:

Specific Items:

  • Spot vs Spot Block
  • Application ELB vs Classic ELB
  • Convertible vs. Standard Reserved Instances
  • EBS Cost vs Performance
  • EC2 reverse proxy (link) (link)
  • Glacier Retrieval Options
  • Beanstalk vs. NGINX
  • Cross-region snapshots for databases

AWS Developer Certification – My Plan

Below is my plan to obtain the AWS Developer Certification.

For each of the areas listed further down, I am trying to do the following:

  • Read the FAQs
  • Practice in the console
  • Practice with the CLI and understand the functions/parameters
  • Review all HTTP codes
  • Review all defaults and limits
  • Review uniqueness of each area

Here are the areas I am covering in preparation for the exam.

  • EC2
  • S3
  • DynamoDB
  • SNS
  • SQS
  • VPC
  • ELB
  • Lambda
  • Route 53
  • RDS
  • SWF
  • CloudFormation
  • Elastic Beanstalk
  • API Gateway
  • Storage Gateway
  • EFS
  • CloudWatch
  • CloudTrail
  • IAM

The exam is only 55 questions, so I'm not sure how in depth it will go on each of these.  Regardless, it's good to review all of the areas!

AWS Consistency Models

S3 Consistency Model

  • PUTs (new objects) = Read-after-write consistency model
  • Overwrite PUTs and DELETEs = Eventual consistency model

DynamoDB Consistency Model

  • Write = Eventual consistency model
  • Read = Eventual consistency model (default)
  • Read = Strong consistency model (optional, see the sketch below)

Definitions

  • Read-after-write consistency model = New objects are available to clients without delay.
  • Eventual consistency model = “Eventually” all access attempts to a particular item will return the last updated value.  There is potential here for stale or old data reads while data replication occurs.
  • Strong consistency model = All access attempts (e.g. parallel) to a particular item return the same up-to-date state.  Old/stale data reads are avoided, but it will cost you more (DynamoDB strongly consistent reads consume twice the read capacity).
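
To make the DynamoDB options concrete, here is a minimal boto3 sketch of an eventually consistent read next to a strongly consistent one.  The table name ('Users') and key are made up for illustration:

import boto3

dynamodb = boto3.client('dynamodb')
key = {'username': {'S': 'jsmith'}}  # hypothetical table key

# Default read: eventually consistent (may briefly return stale data after a write)
eventual = dynamodb.get_item(TableName='Users', Key=key)

# Optional strongly consistent read: reflects the latest successful write,
# but consumes twice the read capacity
strong = dynamodb.get_item(TableName='Users', Key=key, ConsistentRead=True)

print(eventual.get('Item'))
print(strong.get('Item'))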

Good information to know for any AWS certification tests…. 🙂
