
As well as holding the AWS Machine Learning Partner Competency via North America, Cloudreach has been busy growing its EMEA team and capability in Big Data, Analytics and ML. Our Cloud Data Architects help customers unlock business value from the top down, using a use-case-based methodology to refine Data Architecture and address their Data Governance needs. Our Cloud Data Engineers take a more technical, bottom-up approach, from rapid PoC prototyping through to production-ready Data Platform design and build. The team now has 16 AWS certifications and counting!

General thoughts

The exam guide can be found here. I have added my thoughts against each domain:

Domain 1: Data Engineering (20%)

  • You should be comfortable with common data ingest, storage and transform patterns on AWS for both batch and streaming data (see the sketch after this list).
  • Know when to use different data/analytics services and how they fit together as part of an overall solution.
  • This section isn’t too machine-learning heavy. I had already passed the AWS Certified Big Data – Specialty exam and felt there was some overlap.
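
As a concrete illustration of those ingest patterns, here is a minimal sketch using boto3. The stream and bucket names are hypothetical, and it assumes the Kinesis stream and S3 bucket already exist with suitable IAM permissions:

```python
import json

import boto3

kinesis = boto3.client("kinesis")
s3 = boto3.client("s3")

# Streaming path: push an individual event onto a Kinesis Data Stream.
event = {"sensor_id": "sensor-42", "temperature": 21.7}
kinesis.put_record(
    StreamName="example-ingest-stream",        # hypothetical stream name
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=event["sensor_id"],
)

# Batch path: land a file in S3, ready for Glue/Athena or SageMaker training.
s3.upload_file(
    Filename="daily_export.csv",
    Bucket="example-data-lake",                # hypothetical bucket name
    Key="raw/daily_export.csv",
)
```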

Domain 2: Exploratory Data Analysis (24%)

  • You should be aware of techniques such as categorical encoding, feature engineering and handling missing data (a short sketch follows this list).
  • Understand the common challenges when preparing data for machine learning solutions and possible resolutions.
  • Understand how visualising data can help you spot trends and anomalies.
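
To make that concrete, here is a minimal sketch of handling missing data and categorical encoding with pandas and scikit-learn; the toy DataFrame is made up purely for illustration:

```python
import pandas as pd
from sklearn.impute import SimpleImputer

df = pd.DataFrame({
    "colour": ["red", "blue", None, "red"],
    "size_cm": [10.0, None, 12.5, 11.0],
})

# Handle missing data: fill the numeric gap with the column median.
imputer = SimpleImputer(strategy="median")
df[["size_cm"]] = imputer.fit_transform(df[["size_cm"]])

# Categorical encoding: one-hot encode the colour column
# (the missing value simply gets all-zero dummy columns).
encoded = pd.get_dummies(df, columns=["colour"], prefix="colour")
print(encoded)
```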

Domain 3: Modeling (36%)

  • Understand how to: select, train, optimise and evaluate machine learning models for real-world scenarios.
  • Know the SageMaker built-in algorithms and also how to use custom frameworks within SageMaker e.g. TensorFlow, MXNet.
  • Have experience evaluating ML models using metrics such as precision, recall and F1 score (a short sketch follows this list).
  • At a high level, understand what each of the AWS ML services does (Rekognition, Transcribe, Translate etc.).
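
As a quick refresher on those evaluation metrics, here is a minimal sketch using scikit-learn; the labels and predictions are toy values, not output from a real model:

```python
from sklearn.metrics import precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0]   # ground-truth labels
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]   # hypothetical model predictions

precision = precision_score(y_true, y_pred)  # TP / (TP + FP)
recall = recall_score(y_true, y_pred)        # TP / (TP + FN)
f1 = f1_score(y_true, y_pred)                # harmonic mean of precision and recall

print(f"precision={precision:.2f} recall={recall:.2f} f1={f1:.2f}")
```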

Domain 4: Machine Learning Implementation and Operations (20%)

  • Understand how to: deploy, operationalise and secure your machine learning solutions.
  • You should be comfortable with how SageMaker interacts with: VPC, IAM, S3, security policies and containers.
  • Know how to scale and configure SageMaker endpoints (see the sketch after this list).
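
As an illustration of endpoint configuration and scaling, here is a minimal boto3 sketch. All resource names are hypothetical, and it assumes a SageMaker model called example-model has already been created along with the necessary IAM permissions:

```python
import boto3

sm = boto3.client("sagemaker")
autoscaling = boto3.client("application-autoscaling")

# Endpoint config: which model to serve, on what instance type, and how many instances to start with.
sm.create_endpoint_config(
    EndpointConfigName="example-endpoint-config",
    ProductionVariants=[{
        "VariantName": "AllTraffic",
        "ModelName": "example-model",          # assumes this model already exists
        "InstanceType": "ml.m5.large",
        "InitialInstanceCount": 1,
    }],
)
sm.create_endpoint(EndpointName="example-endpoint",
                   EndpointConfigName="example-endpoint-config")

# Scaling: register the variant with Application Auto Scaling and
# target-track on invocations per instance.
autoscaling.register_scalable_target(
    ServiceNamespace="sagemaker",
    ResourceId="endpoint/example-endpoint/variant/AllTraffic",
    ScalableDimension="sagemaker:variant:DesiredInstanceCount",
    MinCapacity=1,
    MaxCapacity=4,
)
autoscaling.put_scaling_policy(
    PolicyName="invocations-target-tracking",
    ServiceNamespace="sagemaker",
    ResourceId="endpoint/example-endpoint/variant/AllTraffic",
    ScalableDimension="sagemaker:variant:DesiredInstanceCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 100.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "SageMakerVariantInvocationsPerInstance",
        },
    },
)
```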

Learning resources

  • A Cloud Guru course – The AWS Certified Machine Learning – Specialty course gives a very good overview of each of the domains covered in the certification. Note: the course was still in preview and not 100% complete at the time of writing.
  • AWS SageMaker documentation – You can handpick the key modules to read e.g. Build, Train, Deploy, ML frameworks with SageMaker, Authentication, Access Controls, Monitoring and Security. I found the documentation very useful.
  • Google Developer Machine Learning Crash Course – The ‘ML Concepts’ section is generic and covers ML fundamentals. Very good content with some practical demonstrations.
  • Evaluate model metrics – A good blog which covers model evaluation techniques and when to use them.
  • Official AWS practice exam – The official practice exam (paid). I found this to be a good indication of exam readiness.
  • Official AWS sample questions – The official sample questions, with answers (free).

Extra reading

The reading below is included for completeness; a high-level understanding should be sufficient. It might help you confirm or rule out some of the answer options.

Tips for exam day

Once you feel ready to take the exam, go ahead and book it! This can be done on the AWS Training & Certification website. A couple of things to keep in mind for the day:

  • This is a ~3-hour exam. I had enough time to return to flagged questions and take a second, quicker pass over all of them.
  • Try to tease out what the question is testing you on e.g. model accuracy or reducing overfitting. This will help you avoid the ‘distractor’ answers.
  • If the answer isn’t obvious, I usually eliminate incorrect options first and then deduce from there.

Best of luck on your own certification journey.