[DAA-B01] SnowPro Advanced: Data Analyst Beta Exam Guide

The SnowPro Advanced: Data Analyst Certification Beta Exam will validate your advanced knowledge and skills to apply comprehensive data analysis principles using Snowflake and its components.

This certification will test your ability to:
- Prepare and load data
- Perform simple data transformations for data analysis
- Build and troubleshoot advanced SQL queries in Snowflake
- Use Snowflake built-in functions and create user-defined functions
- Perform descriptive and diagnostic data analyses
- Perform data forecasting
- Prepare and present data to meet business requirements

SnowPro™ Advanced: Data Analyst Certification Beta Candidate

1+ year of Snowflake data cloud analytics experience, including practical, hands-on use of the Snowflake Cloud Data Platform. The candidate must hold the SnowPro Core certification in good standing, and should have fluency with advanced SQL.

Knowledge of an additional computer language is recommended but not required.

Target Audience:

  • Snowflake Data Analysts
  • ELT Developers
  • BI Specialists

Exam Format

Exam Version: DAA-B01
Total Number of Questions: 70
Question Types: Multiple Select, Multiple Choice
Time Limit: 115 minutes
Languages: English
Registration Fee: $175 USD ($200 off the regular exam price)
Scoring: Scaled score of 750+ on 0-1000 scale. Beta scores will be emailed approximately 10 weeks after the beta exam window closes.
Prerequisites: SnowPro Core Certified
Delivery Options:

  • Online Proctoring
  • Onsite Testing Centers

The exam will only be offered through Pearson VUE; see the Snowflake certification website for registration details.

Exam Domain Breakdown

This exam guide includes test domains, weightings, and objectives. It is not a comprehensive listing of all the content that will be presented on this examination. The table below lists the main content domains and their weighting ranges.

Domain 1.0: Data Ingestion and Data Preparation

1.1 Use a collection system to retrieve data.

  • Assess how often data needs to be collected
  • Identify the volume of data to be collected
  • Identify data sources
  • Retrieve data from a source

1.2 Perform data discovery to identify what is needed from the available datasets.

  • Query tables in Snowflake
  • Evaluate which transformations are required

1.3 Enrich data by identifying and accessing relevant data from the Snowflake Marketplace.

  • Find external data sets that correlate with available data
  • Use data shares to join data with existing data sets
  • Create tables, views, etc.

1.4 Outline and use best practice considerations relating to data integrity structures.

  • Primary keys for tables
  • Perform table joins between parent/child tables
  • Constraints

1.5 Implement data processing solutions.

  • Aggregate and enrich data
  • Automate and implement data processing
  • Respond to processing failures
  • Use logging and monitoring solutions

1.6 Given a scenario, prepare data and load into Snowflake.

  • Load files
  • Load data from external/internal stages into a Snowflake table
  • Load different types of data
  • Perform general DML (insert, update, delete)
  • Manually add or delete data from a table
  • Identify and resolve data import errors
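
The staged-load pattern above can be sketched in Snowflake SQL. The stage, table, and path names here are placeholders, not part of the exam guide:

```sql
-- Illustrative names: my_stage, sales_raw, and the path are hypothetical
CREATE OR REPLACE FILE FORMAT csv_fmt TYPE = CSV SKIP_HEADER = 1;

-- Load staged files into a table; ON_ERROR controls how import errors are handled
COPY INTO sales_raw
  FROM @my_stage/sales/
  FILE_FORMAT = (FORMAT_NAME = csv_fmt)
  ON_ERROR = 'CONTINUE';

-- Inspect rows that failed to load in the most recent COPY
SELECT * FROM TABLE(VALIDATE(sales_raw, JOB_ID => '_last'));
```

`ON_ERROR = 'CONTINUE'` skips bad rows rather than aborting the load, and `VALIDATE` is one way to review what was skipped.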

1.7 Given a scenario, use Snowflake functions.

  • Scalar functions
  • Aggregate functions
  • Window functions
  • Table functions
  • System functions
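
A single query can exercise several of these function classes at once. The `orders` table and its columns below are illustrative:

```sql
-- Scalar, aggregate-as-window, and ranking window functions in one query
SELECT
  customer_id,
  UPPER(region)                                                    AS region,         -- scalar
  SUM(amount) OVER (PARTITION BY customer_id)                      AS customer_total, -- window aggregate
  RANK()      OVER (PARTITION BY customer_id ORDER BY amount DESC) AS amount_rank     -- ranking
FROM orders;
```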

Domain 2.0: Data Transformation and Data Modeling

2.1 Prepare different data types into a consumable format.

  • CSV
  • JSON (query and parse)
  • Parquet
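
Querying and parsing semi-structured data typically uses path notation, casts, and `FLATTEN`. The `raw_events` table and its `payload` VARIANT column are placeholders:

```sql
-- Traverse JSON paths, cast values, and explode a nested array
SELECT
  payload:device:id::STRING  AS device_id,      -- path traversal + cast
  payload:readings[0]::FLOAT AS first_reading,  -- array element access
  f.value:type::STRING       AS event_type      -- one row per array element
FROM raw_events,
     LATERAL FLATTEN(INPUT => payload:events) f;
```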

2.2 Given a dataset, clean the data.

  • Identify and analyze data anomalies
  • Remove corrupted data
  • Remove duplicates
  • Remove nulls
  • Validate data types
  • Use clones as required by specific use-cases
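
One common cleaning pattern, sketched with illustrative table and column names, removes null keys and keeps only the latest row per key:

```sql
-- customers_raw, customer_id, and updated_at are hypothetical names
CREATE OR REPLACE TABLE customers_clean AS
SELECT *
FROM customers_raw
WHERE customer_id IS NOT NULL            -- drop rows with null keys
QUALIFY ROW_NUMBER() OVER (
  PARTITION BY customer_id
  ORDER BY updated_at DESC
) = 1;                                   -- deduplicate: keep the newest row per key
```

`QUALIFY` filters on window-function results without needing a wrapping subquery.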

2.3 Given a dataset or scenario, work with and query the data.

  • Aggregate and validate the data
  • Apply analytic functions
  • Perform pre-math calculations (for example, randomization, ranking, grouping, min/max)
  • Perform classifications
  • Perform casting - change data types to ensure data can be presented consistently
  • Enrich the data
  • Use cartesian joins, sub-queries, CTEs, and union queries
  • Work with hierarchical data
  • Use sampling, approximation, and estimation features
  • Use Time Travel and cloning features
  • Use different data source formats (for example, structured, semi-structured, etc.)
  • Support native data types
  • Perform SQL operations (for example, grouping, sorting, etc.)
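
Several of the features listed above (Time Travel, cloning, approximation, sampling) have short canonical forms. The `orders` table here is a placeholder:

```sql
-- Time Travel: query the table as it was one hour ago (retention permitting)
SELECT COUNT(*) FROM orders AT(OFFSET => -3600);

-- Zero-copy clone for safe experimentation
CREATE TABLE orders_sandbox CLONE orders;

-- Fast approximate distinct count, and a 10% row sample
SELECT APPROX_COUNT_DISTINCT(customer_id) FROM orders;
SELECT * FROM orders SAMPLE (10);
```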

2.4 Use data modeling to manipulate the data to meet BI requirements.

  • Use a dimensional model
  • Compare the use of a dimensional model to the use of a flattened data set
  • Use modeling techniques for the consumption layer

2.5 Optimize query performance.

  • Use the Query Profile
  • Troubleshoot query performance
  • Use the result, metadata, and warehouse caching
  • Use of different types of tables, objects, and views
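
Two troubleshooting tools worth knowing alongside the Query Profile, shown with a hypothetical query:

```sql
-- Inspect the plan without executing the query
EXPLAIN
SELECT c.region, SUM(o.amount)
FROM orders o
JOIN customers c ON o.customer_id = c.customer_id
GROUP BY c.region;

-- Reuse the cached result of the previous query for follow-on analysis
SELECT * FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()));
```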

Domain 3.0: Data Analysis

3.1 Use SQL extensibility features.

  • User-Defined Functions (UDFs)
  • Stored procedures
  • Regular, secured, and materialized views
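
As a minimal sketch of a SQL UDF (the function name and logic are illustrative):

```sql
-- Wrap reusable logic in a user-defined function
CREATE OR REPLACE FUNCTION fiscal_quarter(d DATE)
  RETURNS STRING
  AS $$ 'Q' || TO_CHAR(QUARTER(d)) $$;

SELECT fiscal_quarter('2024-02-15'::DATE);  -- returns 'Q1'
```

UDFs can also be written in other languages (for example, JavaScript or Python), which is where the recommended second computer language becomes useful.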

3.2 Perform a descriptive analysis.

  • Summarize large data sets using Snowsight dashboards
  • Perform exploratory ad-hoc analyses

3.3 Perform a diagnostic analysis.

  • Find reasons/causes of anomalies or patterns in historical data
  • Collect related data
  • Identify demographics and relationships
  • Analyze statistics and trends

3.4 Perform forecasting.

  • Use statistics and built in functions
  • Make predictions based on data
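
A minimal linear-trend projection can be built from the regression aggregates available in Snowflake; the `daily_sales(day_num, revenue)` table below is a placeholder:

```sql
-- Fit a least-squares line with built-in aggregates, then project forward
WITH fit AS (
  SELECT REGR_SLOPE(revenue, day_num)     AS slope,
         REGR_INTERCEPT(revenue, day_num) AS intercept
  FROM daily_sales
)
SELECT slope * 400 + intercept AS projected_revenue_day_400
FROM fit;
```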

Domain 4.0: Data Presentation and Data Visualization

4.1 Given a use case, create reports and dashboards to meet business requirements.

  • Evaluate and select the data for building dashboards
  • Compare and contrast different chart types (for example, bar charts, scatter plots, heat grids, scorecards)
  • Connect BI tools to Snowflake
  • Create charts and dashboards in Snowsight

4.2 Given a use case, maintain reports and dashboards to meet business requirements.

  • Build automated and repeatable tasks
  • Operationalize data
  • Store and update data
  • Manage and share Snowsight dashboards
  • Sort and filter data
  • Use and name tiles
  • Configure subscriptions and updates

4.3 Given a use case, incorporate visualizations for dashboards and reports.

  • Present data for business use analyses
  • Identify patterns and trends
  • Identify correlations among variables
  • Customize data presentations using filtering and editing techniques

Recommended Training

We recommend that you have at least one year of hands-on Snowflake practitioner experience in a Data Analyst role prior to attempting this exam. The exam will assess skills through scenario-based questions and real-world examples. As preparation for this exam, we recommend a combination of hands-on experience, instructor-led training, and the use of self-study assets.

Instructor-Led Course recommended for this exam:
Snowflake Data Analyst Training

Free Self Study recommended for this exam:
SnowPro Advanced: Data Analyst Study Guide