SnowPro Advanced Architect Topic 2
Question #: 101
Topic #: 1
A company needs to share its product catalog data with one of its partners. The product catalog data is stored in two database tables, PRODUCT_CATEGORY and PRODUCT_DETAILS, which can be joined on the PRODUCT_ID column. Data access should be governed, and only the partner should have access to the records.
The partner is not a Snowflake customer. The partner uses Amazon S3 for cloud storage.
Which design will be the MOST cost-effective and secure, while using the required Snowflake features?
A. Use Secure Data Sharing with an S3 bucket as a destination.
B. Publish PRODUCT_CATEGORY and PRODUCT_DETAILS data sets on the Snowflake Marketplace.
C. Create a database user for the partner and give them access to the required data sets.
D. Create a reader account for the partner and share the data sets as secure views.
Selected Answer: D
Question #: 95
Topic #: 1
What is a key consideration when setting up search optimization service for a table?
A. Search optimization service works best with a column that has a minimum of 100 K distinct values.
B. Search optimization service can significantly improve query performance on partitioned external tables.
C. Search optimization service can help to optimize storage usage by compressing the data into a GZIP format.
D. The table must be clustered with a key having multiple columns for effective search optimization.
Selected Answer: A
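Search optimization is enabled per table, or per column and access method, with ALTER TABLE. A minimal sketch (the table and column names are illustrative):

```sql
-- Enable search optimization for the whole table
ALTER TABLE sales ADD SEARCH OPTIMIZATION;

-- Or target only a high-cardinality column used in equality lookups
ALTER TABLE sales ADD SEARCH OPTIMIZATION ON EQUALITY(customer_id);

-- SHOW TABLES reports the build progress in SEARCH_OPTIMIZATION_PROGRESS
SHOW TABLES LIKE 'sales';
```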
Question #: 93
Topic #: 1
What step will improve the performance of queries executed against an external table?
A. Partition the external table.
B. Shorten the names of the source files.
C. Convert the source files’ character encoding to UTF-8.
D. Use an internal stage instead of an external stage to store the source files.
Selected Answer: A
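Partitioning an external table lets Snowflake prune files by partition column. A minimal sketch, assuming a date-based path layout in the stage (stage, table, and column names are illustrative):

```sql
-- Partition column must be derived from METADATA$FILENAME
CREATE EXTERNAL TABLE ext_logs (
  event_date DATE AS TO_DATE(SPLIT_PART(METADATA$FILENAME, '/', 2), 'YYYY-MM-DD'),
  id NUMBER AS (VALUE:id::NUMBER)
)
PARTITION BY (event_date)
LOCATION = @my_stage/logs/
FILE_FORMAT = (TYPE = PARQUET);
```

Queries that filter on event_date then scan only the matching files.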
Question #: 75
Topic #: 1
How can the Snowpipe REST API be used to keep a log of data load history?
A. Call insertReport every 20 minutes, fetching the last 10,000 entries.
B. Call loadHistoryScan every minute for the maximum time range.
C. Call insertReport every 8 minutes for a 10-minute time range.
D. Call loadHistoryScan every 10 minutes for a 15-minute time range.
Selected Answer: D
Question #: 51
Topic #: 1
A Snowflake Architect is designing a multi-tenant application strategy for an organization in the Snowflake Data Cloud and is considering using an Account Per Tenant strategy.
Which requirements will be addressed with this approach? (Choose two.)
A. There needs to be fewer objects per tenant.
B. Security and Role-Based Access Control (RBAC) policies must be simple to configure.
C. Compute costs must be optimized.
D. Tenant data shape may be unique per tenant.
E. Storage costs must be optimized.
Selected Answer: BD
Question #: 47
Topic #: 1
What Snowflake features should be leveraged when modeling using Data Vault?
A. Snowflake’s support of multi-table inserts into the data model’s Data Vault tables
B. Data needs to be pre-partitioned to obtain a superior data access performance
C. Scaling up the virtual warehouses will support parallel processing of new source loads
D. Snowflake’s ability to hash keys so that hash key joins can run faster than integer joins
Selected Answer: A
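A multi-table insert can load a Data Vault hub and satellite from one staging scan. A minimal sketch (staging and target table names are illustrative):

```sql
-- One pass over the staging table populates both Data Vault tables
INSERT ALL
  INTO hub_customer (customer_hk, customer_id, load_dts, record_src)
    VALUES (customer_hk, customer_id, load_dts, record_src)
  INTO sat_customer (customer_hk, name, email, load_dts, record_src)
    VALUES (customer_hk, name, email, load_dts, record_src)
SELECT
  MD5(customer_id)     AS customer_hk,
  customer_id, name, email,
  CURRENT_TIMESTAMP()  AS load_dts,
  'CRM'                AS record_src
FROM stg_customer;
```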
Question #: 10
Topic #: 1
Which organization-related tasks can be performed by the ORGADMIN role? (Choose three.)
A. Changing the name of the organization
B. Creating an account
C. Viewing a list of organization accounts
D. Changing the name of an account
E. Deleting an account
F. Enabling the replication of a database
Selected Answer: BCF
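The three ORGADMIN tasks can be sketched as follows (account names, email, and the target account locator are illustrative):

```sql
USE ROLE ORGADMIN;

-- B: create an account in the organization
CREATE ACCOUNT my_new_account
  ADMIN_NAME = admin
  ADMIN_PASSWORD = '<password>'      -- placeholder
  EMAIL = 'admin@example.com'
  EDITION = ENTERPRISE;

-- C: view the accounts in the organization
SHOW ORGANIZATION ACCOUNTS;

-- F: enable database replication for an account
SELECT SYSTEM$GLOBAL_ACCOUNT_SET_PARAMETER(
  'myorg.my_new_account', 'ENABLE_ACCOUNT_DATABASE_REPLICATION', 'true');
```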
Question #: 79
Topic #: 1
What are characteristics of the use of transactions in Snowflake? (Choose two.)
A. Explicit transactions can contain DDL, DML, and query statements.
B. The AUTOCOMMIT setting can be changed inside a stored procedure.
C. A transaction can be started explicitly by executing a BEGIN WORK statement and end explicitly by executing a COMMIT WORK statement.
D. A transaction can be started explicitly by executing a BEGIN TRANSACTION statement and end explicitly by executing an END TRANSACTION statement.
E. Explicit transactions should contain only DML statements and query statements. All DDL statements implicitly commit active transactions.
Selected Answer: CE
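The BEGIN WORK / COMMIT WORK form and the implicit commit caused by DDL can be sketched as follows (table name is illustrative):

```sql
BEGIN WORK;                 -- equivalent to BEGIN or BEGIN TRANSACTION
INSERT INTO t VALUES (1);
UPDATE t SET c = 2 WHERE id = 1;
COMMIT WORK;                -- or ROLLBACK WORK

-- Note: running a DDL statement such as CREATE TABLE inside an open
-- transaction implicitly commits the active transaction first.
```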
Question #: 76
Topic #: 1
The diagram shows the process flow for Snowpipe auto-ingest with Amazon Simple Notification Service (SNS) with the following steps:
Step 1: Data files are loaded in a stage.
Step 2: An Amazon S3 event notification, published by SNS, informs Snowpipe, by way of Amazon Simple Queue Service (SQS), that files are ready to load. Snowpipe copies the files into a queue.
Step 3: A Snowflake-provided virtual warehouse loads data from the queued files into the target table based on parameters defined in the specified pipe.
If an AWS Administrator accidentally deletes the SQS subscription to the SNS topic in Step 2, what will happen to the pipe that references the topic to receive event messages from Amazon S3?
A. The pipe will continue to receive the messages as Snowflake will automatically restore the subscription to the same SNS topic and will recreate the pipe by specifying the same SNS topic name in the pipe definition.
B. The pipe will no longer be able to receive the messages and the user must wait for 24 hours from the time when the SNS topic subscription was deleted. Pipe recreation is not required as the pipe will reuse the same subscription to the existing SNS topic after 24 hours.
C. The pipe will continue to receive the messages as Snowflake will automatically restore the subscription by creating a new SNS topic. Snowflake will then recreate the pipe by specifying the new SNS topic name in the pipe definition.
D. The pipe will no longer be able to receive the messages. To restore the system immediately, the user needs to manually create a new SNS topic with a different name and then recreate the pipe by specifying the new SNS topic name in the pipe definition.
Selected Answer: D
Question #: 73
Topic #: 1
Which Snowflake objects can be used in a data share? (Choose two.)
A. Standard view
B. Secure view
C. Stored procedure
D. External table
E. Stream
Selected Answer: BD
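Granting a secure view to a share can be sketched as follows (database, view, share, and account names are illustrative):

```sql
CREATE SECURE VIEW prod_db.public.v_catalog AS
  SELECT product_id, product_name
  FROM prod_db.public.product_details;

CREATE SHARE catalog_share;
GRANT USAGE  ON DATABASE prod_db               TO SHARE catalog_share;
GRANT USAGE  ON SCHEMA   prod_db.public        TO SHARE catalog_share;
GRANT SELECT ON VIEW prod_db.public.v_catalog  TO SHARE catalog_share;

ALTER SHARE catalog_share ADD ACCOUNTS = myorg.partner_acct;
```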
Question #: 69
Topic #: 1
Is it possible for a data provider account with a Snowflake Business Critical edition to share data with an Enterprise edition data consumer account?
A. A Business Critical account cannot be a data sharing provider to an Enterprise consumer. Any consumer accounts must also be Business Critical.
B. If a user in the provider account with role authority to CREATE or ALTER SHARE adds an Enterprise account as a consumer, it can import the share.
C. If a user in the provider account with a share owning role sets SHARE_RESTRICTIONS to False when adding an Enterprise consumer account, it can import the share.
D. If a user in the provider account with a share-owning role that also has the OVERRIDE SHARE RESTRICTIONS privilege sets SHARE_RESTRICTIONS to False when adding an Enterprise consumer account, it can import the share.
Selected Answer: D
Question #: 67
Topic #: 1
An Architect clones a database and all of its objects, including tasks. After the cloning, the tasks stop running.
Why is this occurring?
A. Tasks cannot be cloned.
B. The objects that the tasks reference are not fully qualified.
C. Cloned tasks are suspended by default and must be manually resumed.
D. The Architect has insufficient privileges to alter tasks on the cloned database.
Selected Answer: C
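Resuming cloned tasks can be sketched as follows (database and task names are illustrative):

```sql
-- Resume a single cloned task
ALTER TASK clone_db.public.load_task RESUME;

-- For a task tree, resume the root task and all of its dependents at once
SELECT SYSTEM$TASK_DEPENDENTS_ENABLE('clone_db.public.root_task');
```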
Question #: 66
Topic #: 1
Which SQL ALTER command will MAXIMIZE memory and compute resources for a Snowpark stored procedure when executed on the snowpark_opt_wh warehouse?
A. alter warehouse snowpark_opt_wh set max_concurrency_level = 1;
B. alter warehouse snowpark_opt_wh set max_concurrency_level = 2;
C. alter warehouse snowpark_opt_wh set max_concurrency_level = 8;
D. alter warehouse snowpark_opt_wh set max_concurrency_level = 16;
Selected Answer: A
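Setting MAX_CONCURRENCY_LEVEL = 1 dedicates the warehouse's full memory and compute to a single query at a time, which is the point of a Snowpark-optimized warehouse for memory-heavy stored procedures. A minimal sketch:

```sql
-- Snowpark-optimized warehouses are created with WAREHOUSE_TYPE
CREATE WAREHOUSE snowpark_opt_wh
  WAREHOUSE_SIZE = MEDIUM
  WAREHOUSE_TYPE = 'SNOWPARK-OPTIMIZED';

-- Maximize resources available to each single query
ALTER WAREHOUSE snowpark_opt_wh SET MAX_CONCURRENCY_LEVEL = 1;
```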
Question #: 3
Topic #: 1
At which object type level can the APPLY MASKING POLICY, APPLY ROW ACCESS POLICY and APPLY SESSION POLICY privileges be granted?
A. Global
B. Database
C. Schema
D. Table
Selected Answer: A
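These APPLY privileges are granted at the account (global) level, typically to a central governance role. A minimal sketch (the role name is illustrative):

```sql
GRANT APPLY MASKING POLICY    ON ACCOUNT TO ROLE data_gov_role;
GRANT APPLY ROW ACCESS POLICY ON ACCOUNT TO ROLE data_gov_role;
GRANT APPLY SESSION POLICY    ON ACCOUNT TO ROLE data_gov_role;
```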
Question #: 65
Topic #: 1
Which columns can be included in an external table schema? (Choose three.)
A. VALUE
B. METADATA$ROW_ID
C. METADATA$ISUPDATE
D. METADATA$FILENAME
E. METADATA$FILE_ROW_NUMBER
F. METADATA$EXTERNAL_TABLE_PARTITION
Selected Answer: ADE
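The VALUE column and the two supported metadata pseudocolumns can be queried directly (the external table name is illustrative):

```sql
SELECT
  METADATA$FILENAME,         -- source file path in the stage
  METADATA$FILE_ROW_NUMBER,  -- row number within the source file
  VALUE                      -- the raw row as a VARIANT
FROM ext_logs;
```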
Question #: 63
Topic #: 1
A group of Data Analysts have been granted the role ANALYST_ROLE. They need a Snowflake database where they can create and modify tables, views, and other objects to load with their own data. The Analysts should not have the ability to give other Snowflake users outside of their role access to this data.
How should these requirements be met?
A. Grant ANALYST_ROLE OWNERSHIP on the database, but make sure that ANALYST_ROLE does not have the MANAGE GRANTS privilege on the account.
B. Grant SYSADMIN OWNERSHIP of the database, but grant the create schema privilege on the database to the ANALYST_ROLE.
C. Make every schema in the database a MANAGED ACCESS schema, owned by SYSADMIN, and grant create privileges on each schema to the ANALYST_ROLE for each type of object that needs to be created.
D. Grant ANALYST_ROLE OWNERSHIP on the database, but grant the OWNERSHIP ON FUTURE [object type]s in database privilege to SYSADMIN.
Selected Answer: C
Question #: 107
Topic #: 1
What is a characteristic of Role-Based Access Control (RBAC) as used in Snowflake?
A. Privileges can be granted at the database level and can be inherited by all underlying objects.
B. A user can use a “super-user” access along with SECURITYADMIN to bypass authorization checks and access all databases, schemas, and underlying objects.
C. A user can create managed access schemas to support future grants and ensure only schema owners can grant privileges to other roles.
D. A user can create managed access schemas to support current and future grants and ensure only object owners can grant privileges to other roles.
Selected Answer: D
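A managed access schema centralizes grant decisions with the schema owner, so object owners cannot grant privileges themselves. A minimal sketch (schema and role names are illustrative):

```sql
-- Only the schema owner (or a role with MANAGE GRANTS) can grant
-- privileges on objects in this schema
CREATE SCHEMA analytics.reporting WITH MANAGED ACCESS;

-- Future grants cover objects created later in the schema
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.reporting
  TO ROLE reader_role;
```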
Question #: 104
Topic #: 1
A Snowflake Architect created a new data share and would like to verify that only the intended records in the secure views are visible to the consumers of the data share.
What is the recommended way to validate data accessibility by the consumers?
A. Create reader accounts as shown below and impersonate the consumers by logging in with their credentials.
create managed account reader_acct1 admin_name = user1, admin_password = 'Sdfed43da!44', type = reader;
B. Create a row access policy as shown below and assign it to the data share.
create or replace row access policy rap_acct as (acct_id varchar) returns boolean -> case when 'acct1_role' = current_role() then true else false end;
C. Set the session parameter called SIMULATED_DATA_SHARING_CONSUMER as shown below in order to impersonate the consumer accounts.
alter session set simulated_data_sharing_consumer = 'Consumer Acct1';
D. Alter the share settings as shown below, in order to impersonate a specific consumer account.
alter share sales_share set accounts = 'Consumer1' share_restrictions = true;
Selected Answer: C
Question #: 102
Topic #: 1
A company has a Snowflake environment running in AWS us-west-2 (Oregon). The company needs to share data privately with a customer who is running their Snowflake environment in Azure East US 2 (Virginia).
What is the recommended sequence of operations that must be followed to meet this requirement?
A. 1. Create a share and add the database privileges to the share
2. Create a new listing on the Snowflake Marketplace
3. Alter the listing and add the share
4. Instruct the customer to subscribe to the listing on the Snowflake Marketplace
B. 1. Ask the customer to create a new Snowflake account in Azure EAST US 2 (Virginia)
2. Create a share and add the database privileges to the share
3. Alter the share and add the customer’s Snowflake account to the share
C. 1. Create a new Snowflake account in Azure East US 2 (Virginia)
2. Set up replication between AWS us-west-2 (Oregon) and Azure East US 2 (Virginia) for the database objects to be shared
3. Create a share and add the database privileges to the share
4. Alter the share and add the customer’s Snowflake account to the share
D. 1. Create a reader account in Azure East US 2 (Virginia)
2. Create a share and add the database privileges to the share
3. Add the reader account to the share
4. Share the reader account’s URL and credentials with the customer
Selected Answer: C
Question #: 91
Topic #: 1
What transformations are supported in the below SQL statement? (Choose three.)
CREATE PIPE … AS COPY … FROM (…)
A. Data can be filtered by an optional WHERE clause.
B. Columns can be reordered.
C. Columns can be omitted.
D. Type casts are supported.
E. Incoming data can be joined with other tables.
F. The ON_ERROR = ABORT_STATEMENT copy option can be used.
Selected Answer: BCD
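The three supported transformations, reordering, omitting, and casting columns, can be sketched in one pipe definition (pipe, table, and stage names are illustrative):

```sql
CREATE PIPE my_pipe AS
  COPY INTO target_tbl (id, event_ts)
  FROM (
    SELECT $2::NUMBER,        -- reorder: file column 2 first, plus a cast
           TO_TIMESTAMP($1)   -- cast; any remaining file columns are omitted
    FROM @my_stage
  )
  FILE_FORMAT = (TYPE = CSV);
```

WHERE filters and joins to other tables are not supported in COPY transformations, which is why options A and E are incorrect.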
Question #: 54
Topic #: 1
A healthcare company is deploying a Snowflake account that may include Personal Health Information (PHI). The company must ensure compliance with all relevant privacy standards.
Which best practice recommendations will meet data protection and compliance requirements? (Choose three.)
A. Use, at minimum, the Business Critical edition of Snowflake.
B. Create Dynamic Data Masking policies and apply them to columns that contain PHI.
C. Use the Internal Tokenization feature to obfuscate sensitive data.
D. Use the External Tokenization feature to obfuscate sensitive data.
E. Rewrite SQL queries to eliminate projections of PHI data based on current_role().
F. Avoid sharing data with partner organizations.
Selected Answer: ABD
Question #: 53
Topic #: 1
What are purposes for creating a storage integration? (Choose three.)
A. Control access to Snowflake data using a master encryption key that is maintained in the cloud provider’s key management service.
B. Store a generated identity and access management (IAM) entity for an external cloud provider regardless of the cloud provider that hosts the Snowflake account.
C. Support multiple external stages using one single Snowflake object.
D. Avoid supplying credentials when creating a stage or when loading or unloading data.
E. Create private VPC endpoints that allow direct, secure connectivity between VPCs without traversing the public internet.
F. Manage credentials from multiple cloud providers in one single Snowflake object.
Selected Answer: BCD
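One storage integration stores the generated IAM entity once and can back many stages, none of which need embedded credentials. A minimal sketch (integration name, role ARN, and bucket are illustrative):

```sql
CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake_role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/data/');

-- Multiple stages can reference the same integration; no credentials supplied
CREATE STAGE raw_stage
  URL = 's3://my-bucket/data/raw/'
  STORAGE_INTEGRATION = s3_int;
```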
Question #: 14
Topic #: 1
The table contains five columns and has millions of records. The cardinality distribution of the columns is shown below:
Columns C4 and C5 are mostly used in the GROUP BY and ORDER BY clauses of SELECT queries, whereas columns C1, C2, and C3 are heavily used in filter and join conditions.
The Architect must design a clustering key for this table to improve the query performance.
Based on Snowflake recommendations, how should the clustering key columns be ordered while defining the multi-column clustering key?
A. C5, C4, C2
B. C3, C4, C5
C. C1, C3, C2
D. C2, C1, C3
Selected Answer: D
Question #: 5
Topic #: 1
A large manufacturing company runs a dozen individual Snowflake accounts across its business divisions. The company wants to increase the level of data sharing to support supply chain optimizations and increase its purchasing leverage with multiple vendors.
The company’s Snowflake Architects need to design a solution that would allow the business divisions to decide what to share, while minimizing the level of effort spent on configuration and management. Most of the company divisions use Snowflake accounts in the same cloud deployments with a few exceptions for European-based divisions.
According to Snowflake recommended best practice, how should these requirements be met?
A. Migrate the European accounts in the global region and manage shares in a connected graph architecture. Deploy a Data Exchange.
B. Deploy a Private Data Exchange in combination with data shares for the European accounts.
C. Deploy to the Snowflake Marketplace making sure that invoker_share() is used in all secure views.
D. Deploy a Private Data Exchange and use replication to allow European data shares in the Exchange.
Selected Answer: D
Question #: 17
Topic #: 1
How do Snowflake databases that are created from shares differ from standard databases that are not created from shares? (Choose three.)
A. Shared databases are read-only.
B. Shared databases must be refreshed in order for new data to be visible.
C. Shared databases cannot be cloned.
D. Shared databases are not supported by Time Travel.
E. Shared databases will have the PUBLIC or INFORMATION_SCHEMA schemas without explicitly granting these schemas to the share.
F. Shared databases can also be created as transient databases.
Selected Answer: ACD
Question #: 31
Topic #: 1
An Architect needs to allow a user to create a database from an inbound share.
To meet this requirement, the user’s role must have which privileges? (Choose two.)
A. IMPORT SHARE;
B. IMPORT PRIVILEGES;
C. CREATE DATABASE;
D. CREATE SHARE;
E. IMPORT DATABASE;
Selected Answer: AC
Question #: 29
Topic #: 1
An Architect on a new project has been asked to design an architecture that meets Snowflake security, compliance, and governance requirements as follows:
1. Use Tri-Secret Secure in Snowflake
2. Share some information stored in a view with another Snowflake customer
3. Hide portions of sensitive information from some columns
4. Use zero-copy cloning to refresh the non-production environment from the production environment
To meet these requirements, which design elements must be implemented? (Choose three.)
A. Define row access policies.
B. Use the Business Critical edition of Snowflake.
C. Create a secure view.
D. Use the Enterprise edition of Snowflake.
E. Use Dynamic Data Masking.
F. Create a materialized view.
Selected Answer: BCE
Question #: 12
Topic #: 1
The IT Security team has identified that there is an ongoing credential stuffing attack on many of their organization's systems.
What is the BEST way to find recent and ongoing login attempts to Snowflake?
A. Call the LOGIN_HISTORY Information Schema table function.
B. Query the LOGIN_HISTORY view in the ACCOUNT_USAGE schema in the SNOWFLAKE database.
C. View the History tab in the Snowflake UI and set up a filter for SQL text that contains the text “LOGIN”.
D. View the Users section in the Account tab in the Snowflake UI and review the last login column.
Selected Answer: A
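For recent and ongoing attempts, the Information Schema table function has no ingestion latency (the ACCOUNT_USAGE view can lag by up to two hours). A minimal sketch:

```sql
SELECT event_timestamp, user_name, client_ip, is_success, error_message
FROM TABLE(information_schema.login_history(
  time_range_start => DATEADD('hours', -1, CURRENT_TIMESTAMP()),
  result_limit     => 10000))
ORDER BY event_timestamp DESC;
```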
Question #: 46
Topic #: 1
A company has several sites in different regions from which the company wants to ingest data.
Which of the following will enable this type of data ingestion?
A. The company must have a Snowflake account in each cloud region to be able to ingest data to that account.
B. The company must replicate data between Snowflake accounts.
C. The company should provision a reader account to each site and ingest the data through the reader accounts.
D. The company should use a storage integration for the external stage.
Selected Answer: D
Question #: 39
Topic #: 1
Which Snowflake data modeling approach is designed for BI queries?
A. 3 NF
B. Star schema
C. Data Vault
D. Snowflake schema
Selected Answer: B
Question #: 37
Topic #: 1
A company is using a Snowflake account in Azure. The account has SAML SSO set up using ADFS as a SCIM identity provider. To validate Private Link connectivity, an Architect performed the following steps:
1. Confirmed Private Link URLs are working by logging in with a username/password account
2. Verified DNS resolution by running nslookup against the Private Link URLs
3. Validated connectivity using SnowCD
4. Disabled public access using a network policy restricted to the company's IP address range
However, the following error message is received when using SSO to log into the company account:
IP XX.XXX.XX.XX is not allowed to access snowflake. Contact your local security administrator.
What steps should the Architect take to resolve this error and ensure that the account is accessed using only Private Link? (Choose two.)
A. Alter the Azure security integration to use the Private Link URLs.
B. Add the IP address in the error message to the allowed list in the network policy.
C. Generate a new SCIM access token using system$generate_scim_access_token and save it to Azure AD.
D. Update the configuration of the Azure AD SSO to use the Private Link URLs.
E. Open a case with Snowflake Support to authorize the Private Link URLs’ access to the account.
Selected Answer: AD