Delivery Methods: SAP Certification
Level: Associate
Exam: 80 questions
Sample Questions: View more
Cut Score: 71%
Duration: 180 mins
Languages: English
Description
This certification verifies that you possess the fundamental knowledge required
to explain and execute core implementation project tasks to deploy, adopt, and
extend SAP S/4HANA Cloud Public Edition, as well as core knowledge in Sales. It
proves that the candidate has the overall understanding and technical skills to
participate, in a mentored role, as a member of an SAP S/4HANA Cloud Public
Edition implementation project team with a focus on Sales.
Topic Areas
Please see below the list of topics that may be covered within this
certification and the courses that cover them. Its accuracy does not constitute
a legitimate claim; SAP reserves the right to update the exam content (topics,
items, weighting) at any time.
Solution Processes Implementation for Core Sales 31% - 40%
Describe the execution of core Sales Solution Processes and the use and
configuration of related business functions in SAP S/4HANA Cloud Public Edition
S4C60e (SAP Learning Hub only)
S4C61e (SAP Learning Hub only)
S4C62e (SAP Learning Hub only)
S4C63e (SAP Learning Hub only)
----- OR -----
S4C60 Learning Journey
S4C61 Learning Journey
S4C62 Learning Journey
S4C63 Learning Journey
Introduction to Cloud Computing and SAP Cloud ERP Deployment Options <= 10%
Describe cloud computing, SAP's enterprise portfolio, and Cloud ERP deployment
options and enablement packages.
S4C01e (SAP Learning Hub only)
----- OR -----
Implement S/4HANA Public Cloud
Implementing with a Cloud Mindset, Building the Team, and Conducting
Fit-to-Standard Workshops <= 10%
Implement with a Cloud Mindset, build the implementation team, and conduct
Fit-to-Standard Workshops.
S4C01e (SAP Learning Hub only)
----- OR -----
Implement S/4HANA Public Cloud
Configuration and the SAP Fiori Launchpad <= 10%
Configure business processes with SAP Central Business Configuration and work
with the SAP Fiori Launchpad capabilities.
S4C01e (SAP Learning Hub only)
----- OR -----
Implement S/4HANA Public Cloud
Extensibility and Integration <= 10%
Customize applications and processes with extensibility tools and set up
integrations.
S4C01e (SAP Learning Hub only)
----- OR -----
Implement S/4HANA Public Cloud
Data Migration and Business Process Testing <= 10%
Migrate data from legacy systems and test configured business processes with
manual and automated tests.
S4C01e (SAP Learning Hub only)
----- OR -----
Implement S/4HANA Public Cloud
System Landscapes and Identity Access Management <= 10%
Perform implementation and configuration tasks for sourcing and procurement
solution processes.
S4C01e (SAP Learning Hub only)
----- OR -----
Implement S/4HANA Public Cloud
Organizational Units and System Data for Sales <= 10%
Explain Organizational Units and the basic best practices for system data used
in SAP S/4HANA Cloud Public Edition, specifically addressing Sales.
S46000 (SAP S/4HANA 2023)
----- OR -----
Applying SAP S/4HANA Sales
Solution Processes Implementation for Analytics <= 10%
Describe the use of main analytical solution processes in SAP S/4HANA Cloud
Public Edition.
S4C01e (SAP Learning Hub only)
----- OR -----
Implementing Sales Automation
General Information
Exam Preparation
All SAP consultant certifications are available as Cloud Certifications in the
Certification Hub and can be booked with product code CER006. With CER006 – SAP
Certification in the Cloud, you can take up to six exam attempts of your choice
in one year – from wherever and whenever it suits you! Test dates can be chosen
and booked individually.
Each specific certification comes with its own set of preparation tactics. We
define them as "Topic Areas" and they can be found on each exam description. You
can find the number of questions, the duration of the exam, what areas you will
be tested on, and recommended course work and content you can reference.
Certification exams might contain unscored items that are being tested for
upcoming releases of the exam. These unscored items are randomly distributed
across the certification topics and are not counted towards the final score. The
total number of items of an examination as advertised in the Training Shop is
never exceeded when unscored items are used.
Please be aware that the professional-level certification also requires several
years of practical on-the-job experience and addresses real-life scenarios.
For more information refer to our SAP Certification FAQs.
Safeguarding the Value of Certification
SAP Education has worked hard together with the Certification & Enablement
Influence Council to enhance the value of certification and improve the exams.
An increasing number of customers and partners are now looking towards
certification as a reliable benchmark to safeguard their investments.
Unfortunately, the increased demand for certification has brought with it a
growing number of people who try to attain SAP certification through unfair
means. This ongoing issue has prompted SAP Education to place a new focus on
test security. Our Certification Test Security Guidelines will help you as a
test taker to understand the testing experience.
Security Guidelines
Sample Question and Answers
QUESTION 1 You work on a Sell from Stock (BD9) process in SAP S/4HANA Cloud
Public Edition.
What must be created to confirm a customer's intention to buy the products?
A. Sales quotation
B. Outbound delivery
C. Sales inquiry
D. Sales order
Answer: D
QUESTION 2 You need to manage a customer down payment. Which action do you perform
during sales order entry?
A. Enter an appropriate item in the billing plan of the sales order.
B. Create a sales order with a dedicated order type.
C. Enter a specific condition in the pricing procedure of the sales order.
D. Mark the down payment checkbox at item level.
Answer: A
QUESTION 3 You are working on a Sales Order Processing with Collective Billing
(BKZ) process in SAP S/4HANA Cloud Public Edition.
Which of the following split criteria always prevent the combination of multiple
sales orders into a single outbound delivery? Note: There are 2 correct answers
to this question.
A. Plant
B. Shipping point
C. Payment term
D. Ship-to party
Answer: B D
QUESTION 4 Which of the following documents can be used as a reference to create debit
memo requests?
Note: There are 2 correct answers to this question.
A. Billing document
B. Delivery document
C. Quantity contract
D. Sales order
Answer: A D
QUESTION 5 Which information must you enter manually in the invoice correction process?
A. Billing block
B. Order reason
C. Return reason
D. Billing plan
Earn associated certifications
Passing this exam is required to earn these certifications. Select each
certification title below to view full requirements.
Oracle Cloud Infrastructure 2024 Certified AI Foundations Associate
Format: Multiple Choice
Duration: 60 Minutes
Exam Price: Free
Number of Questions: 40
Passing Score: 65%
Validation: This Exam has been validated against Oracle Cloud Infrastructure
2024
Policy: Cloud Recertification
Prepare to pass exam: 1Z0-1122-24
The Oracle Cloud Infrastructure (OCI) AI Foundations certification is
designed to introduce learners to the fundamental concepts of artificial
intelligence (AI) and machine learning (ML), with a specific focus on the
practical application of these technologies within the Oracle Cloud
Infrastructure. This course is ideal for beginners and provides an accessible
entry point for those looking to enhance their understanding of AI and ML
without the requirement of prior extensive technical experience.
By participating in this course, you will gain a comprehensive overview of the
AI landscape, including an understanding of basic AI and ML concepts, deep
learning fundamentals, and the role of generative AI and large language models
in modern computing. The course is structured to ensure a step-by-step learning
process, guiding you from the basic principles to more complex topics in AI,
making learning both effective and engaging.
Take recommended training
Complete one of the courses below to prepare for your exam (optional):
Become An OCI AI Foundations Associate (2024)
Additional Preparation and Information
A combination of Oracle training and hands-on experience (attained via labs
and/or field experience), in the learning subscription, provides the best
preparation for passing the exam.
Review exam topics Objectives % of Exam
Intro to AI Foundations 10%
Intro to ML Foundations 15%
Intro to DL Foundations 15%
Intro to Generative AI & LLMs 15%
Get started with OCI AI Portfolio 15%
OCI Generative AI and Oracle 23ai 10%
Intro to OCI AI Services* 20%
Intro to AI Foundations Discuss AI Basics
Discuss AI Applications & Types of Data
Explain AI vs ML vs DL
Intro to DL Foundations
Discuss Deep Learning Fundamentals
Explain Convolutional Models (CNN)
Explain Sequence Models (RNN & LSTM)
Intro to Generative AI & LLMs Discuss Generative AI Overview
Discuss Large Language Models Fundamentals
Explain Transformers Fundamentals
Explain Prompt Engineering & Instruction Tuning
Explain LLM Fine Tuning
Get started with OCI AI Portfolio Discuss OCI AI Services Overview
Discuss OCI ML Services Overview
Discuss OCI AI Infrastructure Overview
Explain Responsible AI
OCI Generative AI and Oracle 23ai
Describe OCI Generative AI Services
Discuss Autonomous Database Select AI
Discuss Oracle Vector Search
Intro to OCI AI Services* Explore OCI AI Services & related APIs (Language, Vision, Document
Understanding, Speech)
Sample Question and Answers
QUESTION 1
What is the key feature of Recurrent Neural Networks (RNNs)?
A. They process data in parallel.
B. They are primarily used for image recognition tasks.
C. They have a feedback loop that allows information to persist across different
time steps.
D. They do not have an internal state.
Answer: C
Explanation:
Recurrent Neural Networks (RNNs) are a class of neural networks where
connections between nodes
can form cycles. This cycle creates a feedback loop that allows the network to
maintain an internal
state or memory, which persists across different time steps. This is the key
feature of RNNs that
distinguishes them from other neural networks, such as feedforward neural
networks that process
inputs in one direction only and do not have internal states.
RNNs are particularly useful for tasks where context or sequential information
is important, such as
in language modeling, time-series prediction, and speech recognition. The
ability to retain
information from previous inputs enables RNNs to make more informed predictions
based on the
entire sequence of data, not just the current input.
In contrast:
Option A (They process data in parallel) is incorrect because RNNs typically
process data sequentially, not in parallel.
Option B (They are primarily used for image recognition tasks) is incorrect
because image recognition
is more commonly associated with Convolutional Neural Networks (CNNs), not RNNs.
Option D (They do not have an internal state) is incorrect because having an
internal state is a
defining characteristic of RNNs.
This feedback loop is fundamental to the operation of RNNs and allows them to
handle sequences of
data effectively by "remembering" past inputs to influence future outputs. This
memory capability is
what makes RNNs powerful for applications that involve sequential or
time-dependent data.
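To make the feedback loop concrete, here is a minimal NumPy sketch of a vanilla RNN cell. It is illustrative only: the dimensions and the weight names Wx, Wh, and b are made up for the example. The point is that the same hidden state h is fed back at every step, so an early input can influence the final state.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 3-dimensional inputs, 4-dimensional hidden state.
Wx = rng.normal(size=(4, 3)) * 0.5  # input-to-hidden weights
Wh = rng.normal(size=(4, 4)) * 0.5  # hidden-to-hidden (feedback) weights
b = np.zeros(4)

def rnn_forward(xs):
    """Run a vanilla RNN over a sequence; the same h is reused each step."""
    h = np.zeros(4)  # internal state starts at zero
    for x in xs:
        # Feedback loop: the new state depends on the input AND the previous state.
        h = np.tanh(Wx @ x + Wh @ h + b)
    return h

seq = rng.normal(size=(5, 3))  # a sequence of 5 time steps
h_final = rnn_forward(seq)

# Perturbing only the FIRST time step still changes the FINAL state,
# because information persists through the recurrent connection.
seq2 = seq.copy()
seq2[0] += 1.0
h_final2 = rnn_forward(seq2)
print(np.allclose(h_final, h_final2))  # False: early inputs persist via h
```

A feedforward network applied to each time step independently would give identical final outputs under the same perturbation test, which is exactly the distinction the question is probing.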
QUESTION 2
What role do Transformers perform in Large Language Models (LLMs)?
A. Limit the ability of LLMs to handle large datasets by imposing strict memory
constraints
B. Manually engineer features in the data before training the model
C. Provide a mechanism to process sequential data in parallel and capture
long-range dependencies
D. Image recognition tasks in LLMs
Answer: C
Explanation:
Transformers play a critical role in Large Language Models (LLMs), like GPT-4,
by providing an
efficient and effective mechanism to process sequential data in parallel while
capturing long-range
dependencies. This capability is essential for understanding and generating
coherent and
contextually appropriate text over extended sequences of input.
Sequential Data Processing in Parallel:
Traditional models, like Recurrent Neural Networks (RNNs), process sequences of
data one step at a
time, which can be slow and difficult to scale. In contrast, Transformers allow
for the parallel
processing of sequences, significantly speeding up the computation and making it
feasible to train on large datasets.
This parallelism is achieved through the self-attention mechanism, which enables
the model to
consider all parts of the input data simultaneously, rather than sequentially.
Each token (word,
punctuation, etc.) in the sequence is compared with every other token, allowing
the model to weigh
the importance of each part of the input relative to every other part.
Capturing Long-Range Dependencies:
Transformers excel at capturing long-range dependencies within data, which is
crucial for
understanding context in natural language processing tasks. For example, in a
long sentence or
paragraph, the meaning of a word can depend on other words that are far apart in
the sequence. The
self-attention mechanism in Transformers allows the model to capture these
dependencies
effectively by focusing on relevant parts of the text regardless of their
position in the sequence.
This ability to capture long-range dependencies enhances the model's
understanding of context,
leading to more coherent and accurate text generation.
Applications in LLMs:
In the context of GPT-4 and similar models, the Transformer architecture allows
these models to
generate text that is not only contextually appropriate but also maintains
coherence across long
passages, which is a significant improvement over earlier models. This is why
the Transformer is the
foundational architecture behind the success of GPT models.
Reference:
Transformers are a foundational architecture in LLMs, particularly because they
enable parallel
processing and capture long-range dependencies, which are essential for
effective language
understanding and generation.
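The self-attention mechanism described above can be sketched in a few lines of NumPy (a simplified single-head version, not the full Transformer): every token's query is compared against every token's key in one matrix product, which is what enables parallel processing and direct long-range interactions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # all token pairs compared at once
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(1)
n_tokens, d_k = 6, 8
Q = rng.normal(size=(n_tokens, d_k))
K = rng.normal(size=(n_tokens, d_k))
V = rng.normal(size=(n_tokens, d_k))

out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (6, 8): one contextualized output per token
```

Note that the `scores` matrix has an entry for every pair of positions, so a token at the end of the sequence can attend to one at the beginning just as easily as to its neighbor; that is the long-range-dependency property in miniature.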
QUESTION 3 Which is NOT a category of pretrained foundational models available in the
OCI Generative AI service?
A. Embedding models
B. Translation models
C. Chat models
D. Generation models
Answer: B
Explanation:
The OCI Generative AI service offers various categories of pretrained
foundational models, including
Embedding models, Chat models, and Generation models. These models are designed
to perform a
wide range of tasks, such as generating text, answering questions, and providing
contextual
embeddings. However, Translation models, which are typically used for converting
text from one
language to another, are not a category available in the OCI Generative AI
service's current offerings.
The focus of the OCI Generative AI service is more aligned with tasks related to
text generation, chat
interactions, and embedding generation rather than direct language translation.
QUESTION 4 What does "fine-tuning" refer to in the context of OCI Generative AI
service?
A. Encrypting the data for security reasons
B. Adjusting the model parameters to improve accuracy
C. Upgrading the hardware of the AI clusters
D. Doubling the neural network layers
Answer: B
Explanation:
Fine-tuning in the context of the OCI Generative AI service refers to the
process of adjusting the
parameters of a pretrained model to better fit a specific task or dataset. This
process involves further
training the model on a smaller, task-specific dataset, allowing the model to
refine its understanding
and improve its performance on that specific task. Fine-tuning is essential for
customizing the
general capabilities of a pretrained model to meet the particular needs of a
given application,
resulting in more accurate and relevant outputs. It is distinct from other
processes like encrypting
data, upgrading hardware, or simply increasing the complexity of the model
architecture.
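As a rough analogy (this is not the OCI API, just the underlying idea in NumPy), fine-tuning means continuing gradient descent from pretrained parameters on a small task-specific dataset, rather than retraining from random initialization. All names and numbers below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# "Pretrained" linear model, standing in for a foundation model's parameters.
w_pretrained = rng.normal(size=3)

# Small task-specific dataset whose true weights differ from the pretrained ones.
X = rng.normal(size=(50, 3))
w_task = w_pretrained + np.array([0.5, -0.3, 0.2])
y = X @ w_task

def mse(w):
    return float(np.mean((X @ w - y) ** 2))

# Fine-tuning: continue gradient descent FROM the pretrained weights.
w = w_pretrained.copy()
loss_before = mse(w)
for _ in range(200):
    grad = 2 * X.T @ (X @ w - y) / len(X)
    w -= 0.05 * grad
loss_after = mse(w)
print(loss_after < loss_before)  # True: parameters adjusted to the new task
```

The pretrained weights provide a good starting point, so only a modest amount of task data and compute is needed to specialize the model, which matches the definition in option B.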
QUESTION 5
What is the primary benefit of using Oracle Cloud Infrastructure Supercluster
for AI workloads?
A. It delivers exceptional performance and scalability for complex AI tasks.
B. It is ideal for tasks such as text-to-speech conversion.
C. It offers seamless integration with social media platforms.
D. It provides a cost-effective solution for simple AI tasks.
Answer: A
Explanation:
Oracle Cloud Infrastructure Supercluster is designed to deliver exceptional
performance and
scalability for complex AI tasks. The primary benefit of this infrastructure is
its ability to handle
demanding AI workloads, offering high-performance computing (HPC) capabilities
that are crucial for
training large-scale AI models and processing massive datasets. The architecture
of the Supercluster
ensures low-latency networking, efficient resource allocation, and
high-throughput processing,
making it ideal for AI tasks that require significant computational power, such
as deep learning, data
analytics, and large-scale simulations.
QUESTION 6
Which AI Ethics principle leads to the Responsible AI requirement of
transparency?
A. Explicability
B. Prevention of harm
C. Respect for human autonomy
D. Fairness
Length: Two hours
Registration fee: $ (plus tax where applicable)
Language: English
Exam format: 50-60 multiple choice and multiple select questions
Exam delivery method:
a. Take the online-proctored exam from a remote location; review the online
testing requirements.
b. Take the onsite-proctored exam at a testing center; locate a test center
near you.
Prerequisites: None
Recommended experience: 3+ years of industry experience including 1 or more
years designing and managing solutions using Google Cloud.
Certification Renewal / Recertification: Candidates must recertify in order to
maintain their certification status. Unless explicitly stated in the detailed
exam descriptions, all Google Cloud certifications are valid for two years from
the date of certification. Recertification is accomplished by retaking the exam
during the recertification eligibility time period and achieving a passing
score. You may attempt recertification starting 60 days prior to your
certification expiration date.
Exam overview
Step 1: Get real world experience
Before attempting the Machine Learning Engineer exam, it's recommended that you
have 3+ years of hands-on experience with Google Cloud products and solutions.
Ready to start building? Explore the Google Cloud Free Tier for free usage (up
to monthly limits) of select products.
Step 2: Understand what's on the exam
The exam guide contains a complete list of topics that may be included on
the exam. Review the exam guide to determine if your skills align with the
topics on the exam.
See current exam guide
Step 3: Review the sample questions
Familiarize yourself with the format of questions and example content that
may be covered on the Machine Learning Engineer exam.
Review sample questions
Step 4: Round out your skills with training
Prepare for the exam by following the Machine Learning Engineer learning path.
Explore online training, in-person classes, hands-on labs, and other resources
from Google Cloud.
Start preparing
Prepare for the exam with Googlers and certified experts. Get valuable exam tips
and tricks, as well as insights from industry experts.
Explore Google Cloud documentation for in-depth discussions on the concepts and
critical components of Google Cloud.
Learn about designing, training, building, deploying, and operationalizing
secure ML applications on Google Cloud using the Official Google Cloud Certified
Professional Machine Learning Engineer Study Guide. This guide uses real-world
scenarios to demonstrate how to use the Vertex AI platform and technologies such
as TensorFlow, Kubeflow, and AutoML, as well as best practices on when to choose
a pretrained or a custom model.
Step 5: Schedule an exam
Register and select the option to take the exam remotely or at a nearby testing
center.
Review exam terms and conditions and data sharing policies.
A Professional Machine Learning Engineer builds, evaluates, productionizes, and
optimizes ML models by using Google Cloud technologies and knowledge of proven
models and techniques. The ML Engineer handles large, complex datasets and
creates repeatable, reusable code. The ML Engineer considers responsible AI and
fairness throughout the ML model development process, and collaborates closely
with other job roles to ensure long-term success of ML-based applications. The
ML Engineer has strong programming skills and experience with data platforms and
distributed data processing tools. The ML Engineer is proficient in the areas of
model architecture, data and ML pipeline creation, and metrics interpretation.
The ML Engineer is familiar with foundational concepts of MLOps, application
development, infrastructure management, data engineering, and data governance.
The ML Engineer makes ML accessible and enables teams across the organization.
By training, retraining, deploying, scheduling, monitoring, and improving
models, the ML Engineer designs and creates scalable, performant solutions.
* Note: The exam does not directly assess coding skill. If you have a minimum
proficiency in Python and Cloud SQL, you should be able to interpret any
questions with code snippets.
The Professional Machine Learning Engineer exam assesses your ability to:
Architect low-code ML solutions
Collaborate within and across teams to manage data and models
Scale prototypes into ML models
Serve and scale models
Automate and orchestrate ML pipelines
Monitor ML solutions
Sample Question and Answers
QUESTION 1 As the lead ML Engineer for your company, you are responsible for building
ML models to digitize
scanned customer forms. You have developed a TensorFlow model that converts the
scanned images
into text and stores them in Cloud Storage. You need to use your ML model on the
aggregated data
collected at the end of each day with minimal manual intervention. What should
you do?
A. Use the batch prediction functionality of AI Platform
B. Create a serving pipeline in Compute Engine for prediction
C. Use Cloud Functions for prediction each time a new data point is ingested
D. Deploy the model on AI Platform and create a version of it for online
inference.
Answer: A
Explanation:
Batch prediction is the process of using an ML model to make predictions on a
large set of data
points. Batch prediction is suitable for scenarios where the predictions are not
time-sensitive and can
be done in batches, such as digitizing scanned customer forms at the end of each
day. Batch
prediction can also handle large volumes of data and scale up or down the
resources as needed. AI
Platform provides a batch prediction service that allows users to submit a job
with their TensorFlow
model and input data stored in Cloud Storage, and receive the output predictions
in Cloud Storage as
well. This service requires minimal manual intervention and can be automated
with Cloud Scheduler
or Cloud Functions. Therefore, using the batch prediction functionality of AI
Platform is the best
option for this use case.
Reference:
Batch prediction overview
Using batch prediction
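The batch-prediction pattern itself is simple, independent of any particular service. The sketch below is generic Python, not the AI Platform API: accumulated records are scored in fixed-size chunks on a schedule, instead of serving one request at a time. The model and weights here are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in "model": any callable that scores a batch of feature vectors.
# (Hypothetical weights; a real job would load a trained TensorFlow model.)
weights = rng.normal(size=4)

def model_predict(batch):
    return batch @ weights

def batch_predict(records, batch_size=128):
    """Score a day's accumulated records in fixed-size chunks."""
    outputs = []
    for start in range(0, len(records), batch_size):
        chunk = records[start:start + batch_size]
        outputs.append(model_predict(chunk))  # one model call per chunk
    return np.concatenate(outputs)

day_of_records = rng.normal(size=(1000, 4))  # the aggregated daily input
predictions = batch_predict(day_of_records)
print(len(predictions))  # 1000: one prediction per record
```

A managed batch-prediction service wraps exactly this loop with storage I/O, scaling, and scheduling, which is why it needs minimal manual intervention.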
QUESTION 2 You work for a global footwear retailer and need to predict when an item
will be out of stock based
on historical inventory data. Customer behavior is highly dynamic since footwear
demand is influenced by many different
factors. You want to serve models that are trained on all available data, but
track your performance
on specific subsets of data before pushing to production. What is the most
streamlined and reliable
way to perform this validation?
A. Use the TFX ModelValidator tools to specify performance metrics for
production readiness
B. Use k-fold cross-validation as a validation strategy to ensure that your
model is ready for production.
C. Use the last relevant week of data as a validation set to ensure that your
model is performing accurately on current data
D. Use the entire dataset and treat the area under the receiver operating
characteristics curve (AUC ROC) as the main metric.
Answer: A
Explanation:
TFX ModelValidator is a tool that allows you to compare new models against a
baseline model and evaluate their performance on different metrics and data
slices. You can use this tool to validate your models before deploying them to
production and ensure that they meet your expectations and requirements.
k-fold cross-validation is a technique that splits the data into k subsets and
trains the model on k-1 subsets while testing it on the remaining subset. This
is repeated k times and the average performance is reported. This technique is
useful for estimating the generalization error of a model, but it does not
account for the dynamic nature of customer behavior or the potential changes in
data distribution over time.
Using the last relevant week of data as a validation set is a simple way to
check the model's performance on recent data, but it may not be representative
of the entire dataset or capture the long-term trends and patterns. It also
does not allow you to compare the model with a baseline or evaluate it on
different data slices.
Using the entire dataset and treating the AUC ROC as the main metric is not a
good practice because
it does not leave any data for validation or testing. It also assumes that the
AUC ROC is the only
metric that matters, which may not be true for your business problem. You may
want to consider
other metrics such as precision, recall, or revenue.
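For reference, the k-fold procedure discussed in option B can be sketched directly in NumPy (illustrative only; a real project would typically use a library implementation): the data is partitioned into k folds, and each fold serves as the validation set exactly once.

```python
import numpy as np

def k_fold_indices(n_samples, k, seed=0):
    """Yield (train_idx, val_idx) pairs; each sample is validated exactly once."""
    idx = np.random.default_rng(seed).permutation(n_samples)
    folds = np.array_split(idx, k)
    for i in range(k):
        val_idx = folds[i]
        # Train on the concatenation of all OTHER folds.
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train_idx, val_idx

# Example: 10 samples, 5 folds -> 8 train / 2 validation per fold.
splits = list(k_fold_indices(10, 5))
print(len(splits))  # 5 folds
all_val = np.concatenate([v for _, v in splits])
print(sorted(all_val.tolist()))  # every sample appears in validation once
```

Averaging a metric across the k validation folds gives the generalization estimate the explanation refers to; the shuffling here is why plain k-fold ignores temporal ordering, which is its weakness for this question's dynamic-demand scenario.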
QUESTION 3 You work on a growing team of more than 50 data scientists who all
use AI Platform. You are
designing a strategy to organize your jobs, models, and versions in a clean and
scalable way. Which strategy should you choose?
A. Set up restrictive IAM permissions on the AI Platform notebooks so that only
a single user or group can access a given instance.
B. Separate each data scientist's work into a different project to ensure that
the jobs, models, and versions created by each data scientist are accessible
only to that user.
C. Use labels to organize resources into descriptive categories. Apply a label
to each created resource so that users can filter the results by label when
viewing or monitoring the resources
D. Set up a BigQuery sink for Cloud Logging logs that is appropriately filtered
to capture information about AI Platform resource usage. In BigQuery, create a
SQL view that maps users to the resources they are using.
Answer: C
Explanation:
Labels are key-value pairs that can be attached to any AI Platform resource,
such as jobs, models, versions, or endpoints. Labels can help you organize your
resources into descriptive categories, such as project, team, environment, or
purpose. You can use labels to filter the results when you list or monitor your
resources, or to group them for billing or quota purposes. Using labels is a
simple and scalable way to manage your AI Platform resources without creating
unnecessary complexity or overhead.
Therefore, using labels to organize resources is the best strategy for this use
case.
Reference:
Using labels
Filtering and grouping by labels
QUESTION 4 During batch training of a neural network, you notice that there is an
oscillation in the loss. How should you adjust your model to ensure that it
converges?
A. Increase the size of the training batch
B. Decrease the size of the training batch
C. Increase the learning rate hyperparameter
D. Decrease the learning rate hyperparameter
Answer: D
Explanation:
Oscillation in the loss during batch training of a neural network means that the
model is
overshooting the optimal point of the loss function and bouncing back and forth.
This can prevent
the model from converging to the minimum loss value. One of the main reasons for
this
phenomenon is that the learning rate hyperparameter, which controls the size of
the steps that the
model takes along the gradient, is too high. Therefore, decreasing the learning
rate hyperparameter
can help the model take smaller and more precise steps and avoid oscillation.
This is a common
technique to improve the stability and performance of neural network training.
Reference:
Interpreting Loss Curves
Is learning rate the only reason for training loss oscillation after few epochs?
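The oscillation effect is easy to reproduce on a toy quadratic loss f(x) = x². With gradient 2x, the update is x ← x(1 − 2·lr), so a large learning rate makes the iterate flip sign and overshoot each step, while a smaller one decays smoothly (learning rates below are arbitrary illustrative values):

```python
def gradient_descent(lr, x0=1.0, steps=20):
    """Minimize f(x) = x^2 (gradient 2x) and record the iterates."""
    x, path = x0, [x0]
    for _ in range(steps):
        x = x - lr * 2 * x
        path.append(x)
    return path

high = gradient_descent(lr=0.9)  # update factor 1 - 2*0.9 = -0.8: overshoots
low = gradient_descent(lr=0.1)   # update factor 1 - 2*0.1 =  0.8: smooth decay

# High learning rate: the sign flips every step (oscillation around x = 0).
print(all(a * b < 0 for a, b in zip(high, high[1:])))  # True
# Low learning rate: monotone decrease toward the minimum, no oscillation.
print(all(0 < b < a for a, b in zip(low, low[1:])))    # True
```

Real loss surfaces are not quadratic and batch gradients are noisy, but the same mechanism applies: decreasing the learning rate shrinks the step size and damps the bouncing, which is why option D is correct.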
QUESTION 5 You are building a linear model with over 100 input features, all with
values between -1 and 1.
You suspect that many features are non-informative. You want to remove the
non-informative features
from your model while keeping the informative ones in their original form. Which
technique should you use?
A. Use Principal Component Analysis to eliminate the least informative features.
B. Use L1 regularization to reduce the coefficients of uninformative features to
0.
C. After building your model, use Shapley values to determine which features are
the most informative.
D. Use an iterative dropout technique to identify which features do not degrade
the model when removed.
Answer: B
Explanation:
L1 regularization, also known as Lasso regularization, adds the sum of the
absolute values of the model's coefficients to the loss function. It encourages
sparsity in the model by shrinking some coefficients to precisely zero. This
way, L1 regularization can perform feature selection and remove the
non-informative features from the model while keeping the informative ones in
their original form. Therefore, using L1 regularization is the best technique
for this use case.
Reference:
Regularization in Machine Learning - GeeksforGeeks
Regularization in Machine Learning (with Code Examples) - Dataquest
L1 And L2 Regularization Explained & Practical How To Examples
L1 and L2 as Regularization for a Linear Model
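The sparsity effect can be demonstrated with a small proximal-gradient (ISTA) implementation of L1-regularized regression in NumPy. This is a sketch with made-up data and penalty strength: one informative feature keeps a (slightly shrunken) weight, while pure-noise features are driven exactly to zero.

```python
import numpy as np

rng = np.random.default_rng(4)

# 1 informative feature (true weight 2.0) plus 4 pure-noise features.
X = rng.normal(size=(200, 5))
y = 2.0 * X[:, 0] + 0.01 * rng.normal(size=200)

def soft_threshold(w, t):
    """Proximal operator of the L1 penalty: shrink toward 0, clip at 0."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

lam, lr = 0.3, 0.1  # hypothetical penalty strength and step size
w = np.zeros(5)
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(X)            # gradient of the squared loss
    w = soft_threshold(w - lr * grad, lr * lam)  # ISTA step

print(w[0])                # close to 2 (minus some L1 shrinkage)
print(np.abs(w[1:]).max()) # at or near exactly 0 for the noise features
```

The soft-thresholding step is what sets small coefficients to exactly zero; L2 regularization, by contrast, only shrinks them toward zero without eliminating them, which is why L1 is the feature-selection tool here.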
QUESTION 6 Your team has been tasked with creating an ML solution in Google Cloud to
classify support requests
for one of your platforms. You analyzed the requirements and decided to use
TensorFlow to build the
classifier so that you have full control of the model's code, serving, and
deployment. You will use
Kubeflow pipelines for the ML platform. To save time, you want to build on
existing resources and
use managed services instead of building a completely new model. How should you
build the classifier?
A. Use the Natural Language API to classify support requests
B. Use AutoML Natural Language to build the support requests classifier
C. Use an established text classification model on AI Platform to perform
transfer learning
D. Use an established text classification model on AI Platform as-is to classify
support requests
Answer: C
Explanation:
Transfer learning is a technique that leverages the knowledge and weights of a
pre-trained model and adapts them to a new task or domain. Transfer learning
can save time and resources by avoiding training a model from scratch, and can
also improve the performance and generalization of the model by using a larger
and more diverse dataset. AI Platform provides several established text
classification models that can be used for transfer learning, such as BERT,
ALBERT, or XLNet. These models are based on state-of-the-art natural language
processing techniques and can handle various text classification tasks, such as
sentiment analysis, topic classification, or spam detection. By using one of
these models on AI Platform, you can customize the model's code, serving, and
deployment, and use Kubeflow pipelines for the ML platform. Therefore, using an
established text classification model on AI Platform to perform transfer
learning is the best option for this use case.
Reference:
Transfer Learning - Machine Learnings Next Frontier
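The core idea above — keep a pretrained feature extractor frozen and train only
a small new head — can be sketched in plain Python. This is a toy illustration,
not the AI Platform workflow itself: the "pretrained" weights and the tiny
dataset are invented stand-ins for a real model such as BERT and real support
requests.

```python
import math

# "Pretrained" feature extractor: a fixed linear layer whose weights we
# freeze, standing in for the lower layers of a model such as BERT.
FROZEN_W = [[0.9, -0.2], [-0.3, 0.8]]

def extract_features(x):
    """Apply the frozen pretrained layer; its weights are never updated."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in FROZEN_W]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_head(data, epochs=200, lr=0.5):
    """Train only the new classification head (logistic regression)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            f = extract_features(x)
            p = sigmoid(w[0] * f[0] + w[1] * f[1] + b)
            err = p - y  # gradient of the log-loss w.r.t. the logit
            w = [wi - lr * err * fi for wi, fi in zip(w, f)]
            b -= lr * err
    return w, b

# Toy stand-in for labeled support requests: 2-D inputs, binary labels.
data = [([1.0, 0.1], 1), ([0.9, 0.0], 1), ([0.0, 1.0], 0), ([0.1, 0.9], 0)]
w, b = train_head(data)
preds = [int(sigmoid(w[0] * extract_features(x)[0]
             + w[1] * extract_features(x)[1] + b) > 0.5) for x, _ in data]
```

In a real transfer-learning setup the same split applies: the pretrained layers
stay fixed (or are fine-tuned at a low learning rate) while only the new
task-specific head is trained from scratch.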
Updates to the exam
Our exams are updated periodically to reflect skills that are required to
perform a role. We have included two versions of the Skills Measured objectives
depending on when you are taking the exam.
We always update the English language version of the exam first. Some exams are
localized into other languages, and those are updated approximately eight weeks
after the English version is updated. While Microsoft makes every effort to
update localized versions as noted, there may be times when localized versions
of an exam are not updated on this schedule. Other available languages are
listed in the Schedule Exam section of the Exam Details webpage. If the exam
isn't available in your preferred language, you can request an additional 30
minutes to complete the exam.
Note The bullets that follow each of the skills measured are intended to
illustrate how we are assessing that skill. Related topics may be covered in the
exam.
Note Most questions cover features that are general availability (GA). The exam
may contain questions on Preview features if those features are commonly used.
Skills measured as of January 23, 2024
Audience profile
As a Microsoft cybersecurity architect, you translate a cybersecurity strategy
into capabilities that protect the assets, business, and operations of an
organization. You design, guide the implementation of, and maintain security
solutions that follow Zero Trust principles and best practices, including
security strategies for:
Identity
Devices
Data
Applications
Network
Infrastructure
DevOps
Plus, you design solutions for:
Governance, risk, and compliance (GRC)
Security operations
Security posture management
As a cybersecurity architect, you continuously collaborate with leaders and
practitioners in IT security, privacy, and other roles across an organization to
plan and implement a cybersecurity strategy that meets the business needs of an
organization.
As a candidate for this exam, you have experience implementing or administering
solutions in the following areas:
Identity and access
Platform protection
Security operations
Data security
Application security
Hybrid and multicloud infrastructures
You should have expert skills in at least one of those areas, and you should
have experience designing security solutions that include Microsoft security
technologies.
Skills at a glance
Design solutions that align with security best practices and priorities (20–25%)
Design security operations, identity, and compliance capabilities (30–35%)
Design security solutions for infrastructure (20–25%)
Design security solutions for applications and data (20–25%)
Design solutions that align with security best practices and priorities (20–25%)
Design a resiliency strategy for ransomware and other attacks based on Microsoft
Security Best Practices
Design a security strategy to support business resiliency goals, including
identifying and prioritizing threats to business-critical assets
Design solutions that align with Microsoft ransomware best practices, including
backup, restore, and privileged access
Design configurations for secure backup and restore by using Azure Backup for
hybrid and multicloud environments
Design solutions for security updates
Design solutions that align with the Microsoft Cybersecurity Reference
Architectures (MCRA) and Microsoft cloud security benchmark (MCSB)
Design solutions that align with best practices for cybersecurity capabilities
and controls
Design solutions that align with best practices for protecting against insider
and external attacks
Design solutions that align with best practices for Zero Trust security,
including the Zero Trust Rapid Modernization Plan (RaMP)
Design solutions that align with the Microsoft Cloud Adoption Framework for
Azure and the Microsoft Azure Well-Architected Framework
Design a new or evaluate an existing strategy for security and governance based
on the Microsoft Cloud Adoption Framework (CAF) for Azure and the Microsoft
Azure Well-Architected Framework
Recommend solutions for security and governance based on the Microsoft Cloud
Adoption Framework for Azure and the Microsoft Azure Well-Architected Framework
Design solutions for implementing and governing security by using Azure landing
zones
Design a DevSecOps process
Design security operations, identity, and compliance capabilities (30–35%)
Design solutions for security operations
Develop security operations capabilities to support a hybrid or multicloud
environment
Design a solution for centralized logging and auditing
Design a solution for security information and event management (SIEM),
including Microsoft Sentinel
Design a solution for detection and response that includes extended detection
and response (XDR)
Design a solution for security orchestration automated response (SOAR),
including Microsoft Sentinel and Microsoft Defender
Design and evaluate security workflows, including incident response, threat
hunting, incident management, and threat intelligence
Design and evaluate threat detection coverage by using MITRE ATT&CK
Design solutions for identity and access management
Design a solution for access to software as a service (SaaS), platform as a
service (PaaS), infrastructure as a service (IaaS), hybrid/on-premises, and
multicloud resources, including identity, networking, and application controls
Design a solution for Microsoft Entra ID, including hybrid and multi-cloud
environments
Design a solution for external identities, including business-to-business (B2B),
business-to-customer (B2C), and decentralized identity
Design a modern authentication and authorization strategy, including Conditional
Access, continuous access evaluation, threat intelligence integration, and risk
scoring
Validate the alignment of Conditional Access policies with a Zero Trust strategy
Specify requirements to secure Active Directory Domain Services (AD DS)
Design a solution to manage secrets, keys, and certificates
Design solutions for securing privileged access
Design a solution for assigning and delegating privileged roles by using the
enterprise access model
Design an identity governance solution, including Microsoft Entra Privileged
Identity Management (PIM), privileged access management, entitlement management,
and access reviews
Design a solution for securing the administration of cloud tenants, including
SaaS and multicloud infrastructure and platforms
Design a solution for cloud infrastructure entitlement management that includes
Microsoft Entra Permissions Management
Design a solution for Privileged Access Workstation (PAW) and bastion services
Design solutions for regulatory compliance
Translate compliance requirements into a security solution
Design a solution to address compliance requirements by using Microsoft Purview
risk and compliance solutions
Design a solution to address privacy requirements, including Microsoft Priva
Design Azure Policy solutions to address security and compliance requirements
Evaluate infrastructure compliance by using Microsoft Defender for Cloud
Design security solutions for infrastructure (20–25%)
Design solutions for security posture management in hybrid and multicloud
environments
Evaluate security posture by using MCSB
Evaluate security posture by using Microsoft Defender for Cloud
Evaluate security posture by using Microsoft Secure Score
Design integrated security posture management and workload protection solutions
in hybrid and multi-cloud environments, including Microsoft Defender for Cloud
Design cloud workload protection solutions that use Microsoft Defender for
Cloud, such as Microsoft Defender for Servers, Microsoft Defender for App
Service, and Microsoft Defender for SQL
Design a solution for integrating hybrid and multicloud environments by using
Azure Arc
Design a solution for Microsoft Defender External Attack Surface Management
(Defender EASM)
Design solutions for securing server and client endpoints
Specify security requirements for servers, including multiple platforms and
operating systems
Specify security requirements for mobile devices and clients, including endpoint
protection, hardening, and configuration
Specify security requirements for IoT devices and embedded systems
Design a solution for securing operational technology (OT) and industrial
control systems (ICS) by using Microsoft Defender for IoT
Specify security baselines for server and client endpoints
Design a solution for secure remote access
Specify requirements for securing SaaS, PaaS, and IaaS services
Specify security baselines for SaaS, PaaS, and IaaS services
Specify security requirements for IoT workloads
Specify security requirements for web workloads, including Azure App Service
Specify security requirements for containers
Specify security requirements for container orchestration
Design security solutions for applications and data (20–25%)
Design solutions for securing Microsoft 365
Evaluate security posture for productivity and collaboration workloads by using
metrics, including Microsoft Secure Score and Microsoft Defender for Cloud
secure score
Design a Microsoft 365 Defender solution
Design secure configurations and operational practices for Microsoft 365
workloads and data
Design solutions for securing applications
Evaluate the security posture of existing application portfolios
Evaluate threats to business-critical applications by using threat modeling
Design and implement a full lifecycle strategy for application security
Design and implement standards and practices for securing the application
development process
Map technologies to application security requirements
Design a solution for workload identity to authenticate and access Azure cloud
resources
Design a solution for API management and security
Design a solution for secure access to applications, including Azure Web
Application Firewall (WAF) and Azure Front Door
Design solutions for securing an organization's data
Design a solution for data discovery and classification by using Microsoft
Purview data governance solutions
Specify priorities for mitigating threats to data
Design a solution for protection of data at rest, data in motion, and data in
use
Design a security solution for data in Azure workloads, including Azure SQL,
Azure Synapse Analytics, and Azure Cosmos DB
Design a security solution for data in Azure Storage
Design a security solution that includes Microsoft Defender for Storage and
Microsoft Defender for SQL
Sample Questions and Answers
New Topic: Topic 1, Fabrikam, Inc. Case Study 1
Overview
Fabrikam, Inc. is an insurance company that has a main office in New York and a
branch office in Paris.
On-premises Environment
The on-premises network contains a single Active Directory Domain Services (AD
DS) domain named corp.fabrikam.com.
Azure Environment
Fabrikam has the following Azure resources:
An Azure Active Directory (Azure AD) tenant named fabrikam.onmicrosoft.com
that syncs with corp.fabrikam.com
A single Azure subscription named Sub1
A virtual network named Vnet1 in the East US Azure region
A virtual network named Vnet2 in the West Europe Azure region
An instance of Azure Front Door named FD1 that has Azure Web Application
Firewall (WAF) enabled
A Microsoft Sentinel workspace
An Azure SQL database named ClaimsDB that contains a table named ClaimDetails
20 virtual machines that are configured as application servers and are NOT
onboarded to
Microsoft Defender for Cloud
A resource group named TestRG that is used for testing purposes only
An Azure Virtual Desktop host pool that contains personal assigned session hosts
All the resources in Sub1 are in either the East US or the West Europe region.
Partners
Fabrikam has contracted a company named Contoso, Ltd. to develop applications.
Contoso has the following infrastructure:
An Azure AD tenant named contoso.onmicrosoft.com
An Amazon Web Services (AWS) implementation named ContosoAWS1 that contains AWS
EC2
instances used to host test workloads for the applications of Fabrikam
Developers at Contoso will connect to the resources of Fabrikam to test or
update applications. The developers will be added to a security group named
ContosoDevelopers in fabrikam.onmicrosoft.com that will be assigned to roles
in Sub1.
The ContosoDevelopers group is assigned the db_owner role for the ClaimsDB
database.
Compliance Environment
Fabrikam deploys the following compliance environment:
Defender for Cloud is configured to assess all the resources in Sub1 for
compliance to the HIPAA HITRUST standard.
Currently, resources that are noncompliant with the HIPAA HITRUST standard are
remediated manually.
Qualys is used as the standard vulnerability assessment tool for servers.
Problem Statements
The secure score in Defender for Cloud shows that all the virtual machines
generate the following recommendation: Machines should have a vulnerability
assessment solution.
All the virtual machines must be compliant in Defender for Cloud.
ClaimsApp Deployment
Fabrikam plans to implement an internet-accessible application named ClaimsApp
that will have the following specifications:
ClaimsApp will be deployed to Azure App Service instances that connect to
Vnet1 and Vnet2.
Users will connect to ClaimsApp by using a URL of https://claims.fabrikam.com.
ClaimsApp will access data in ClaimsDB.
ClaimsDB must be accessible only from Azure virtual networks.
The app services permission for ClaimsApp must be assigned to ClaimsDB.
Application Development Requirements
Fabrikam identifies the following requirements for application development:
Azure DevTest labs will be used by developers for testing.
All the application code must be stored in GitHub Enterprise.
Azure Pipelines will be used to manage application deployments.
All application code changes must be scanned for security vulnerabilities,
including application
code or configuration files that contain secrets in clear text. Scanning must be
done at the time the
code is pushed to a repository.
Security Requirements
Fabrikam identifies the following security requirements:
Internet-accessible applications must prevent connections that originate in
North Korea.
Only members of a group named InfraSec must be allowed to configure network
security groups (NSGs) and instances of Azure Firewall, WAF, and Front Door in
Sub1.
Administrators must connect to a secure host to perform any remote
administration of the virtual
machines. The secure host must be provisioned from a custom operating system
image.
AWS Requirements
Fabrikam identifies the following security requirements for the data hosted
in ContosoAWS1:
Notify security administrators at Fabrikam if any AWS EC2 instances are
noncompliant with secure
score recommendations.
Ensure that the security administrators can query AWS service logs directly from
the Azure environment.
Contoso Developer Requirements
Fabrikam identifies the following requirements for the Contoso developers:
Every month, the membership of the ContosoDevelopers group must be verified.
The Contoso developers must use their existing contoso.onmicrosoft.com
credentials to access the resources in Sub1.
The Contoso developers must be prevented from viewing the data in a column
named MedicalHistory in the ClaimDetails table.
Compliance Requirements
Fabrikam wants to automatically remediate the virtual machines in Sub1 to be
compliant with the HIPAA HITRUST standard. The virtual machines in TestRG must
be excluded from the compliance assessment.
QUESTION 1
You need to recommend a solution to meet the security requirements for the
InfraSec group.
What should you use to delegate the access?
A. a subscription
B. a custom role-based access control (RBAC) role
C. a resource group
D. a management group
Answer: B
QUESTION 2
You need to recommend a solution to scan the application code. The solution
must meet the application development requirements. What should you include in
the recommendation?
A. Azure Key Vault
B. GitHub Advanced Security
C. Application Insights in Azure Monitor
D. Azure DevTest Labs
Answer: B
QUESTION 3
You need to recommend a solution to resolve the virtual machine issue. What
should you include in the recommendation? (Choose two.)
A. Onboard the virtual machines to Microsoft Defender for Endpoint.
B. Onboard the virtual machines to Azure Arc.
C. Create a device compliance policy in Microsoft Endpoint Manager.
D. Enable the Qualys scanner in Defender for Cloud.
Answer: A, B
QUESTION 4
HOTSPOT
What should you create in Azure AD to meet the Contoso developer requirements?
Answer:
Box 1: A synced user account
A synced user account is needed.
Box 2: An access review
QUESTION 5
HOTSPOT
You are evaluating the security of ClaimsApp.
For each of the following statements, select Yes if the statement is true.
Otherwise, select No.
NOTE: Each correct selection is worth one point.
Answer:
Explanation:
QUESTION 6
You need to recommend a solution to secure the MedicalHistory data in the
ClaimDetails table. The solution must meet the Contoso developer requirements.
What should you include in the recommendation?
A. Transparent Data Encryption (TDE)
B. Always Encrypted
C. row-level security (RLS)
D. dynamic data masking
E. data classification
Answer: D
QUESTION 7
HOTSPOT
You need to recommend a solution to meet the AWS requirements.
What should you include in the recommendation? To answer, select the appropriate
options in the
answer area.
NOTE: Each correct selection is worth one point.
Answer:
QUESTION 8
You need to recommend a solution to meet the security requirements for the
virtual machines.
What should you include in the recommendation?
A. an Azure Bastion host
B. a network security group (NSG)
C. just-in-time (JIT) VM access
D. Azure Virtual Desktop
Answer: A
QUESTION 9
HOTSPOT
You need to recommend a solution to meet the requirements for connections to
ClaimsDB.
What should you recommend using for each requirement? To answer, select the
appropriate options
in the answer area.
NOTE: Each correct selection is worth one point.
Answer:
Explanation:
QUESTION 10
HOTSPOT
You need to recommend a solution to meet the compliance requirements.
What should you recommend? To answer, select the appropriate options in the
answer area.
NOTE: Each correct selection is worth one point.
Format: Multiple Choice
Duration: 60 minutes
Exam Price: $
Number of Questions: 30
Passing Score: 65%
Validation: This exam has been validated against Oracle Cloud Infrastructure
2023
Policy: Cloud Recertification
Earn associated certifications
Passing this exam is required to earn these certifications. Select each
certification title below to view full requirements.
Prepare to pass exam: 1Z0-1115-23
The Oracle Cloud Infrastructure 2023 Multicloud Architect Associate
certification is designed to test an individual's expertise in designing and
implementing Oracle Cloud Infrastructure (OCI) multicloud solutions. This
certification aims to evaluate the candidate's ability to use a combination of
cloud services to build a multicloud environment.
Take recommended training
Complete one of the courses below to prepare for your exam (optional):
Become an OCI Multicloud Architect Associate
Practice Exam Additional Preparation and Information
A combination of Oracle training and hands-on experience (attained via labs
and/or field experience), in the learning subscription, provides the best
preparation for passing the exam.
Review exam topics
The following table lists the exam objectives and their weighting.
Objectives % in Exam
Introduction to Multi-cloud 10%
Core OCI Services 20%
Multicloud Connection Options 30%
Oracle Database Service for Azure 40%
Introduction to Multi-cloud [10%]
Understand multi-cloud and its benefits
Explain common multi-cloud use cases and their implementation in OCI
Core OCI Services [20%]
Federate OCI Identity Domains with identity providers
Implement and manage VCN components
Administer OCI Database services, such as Base Databases, Autonomous Databases,
MySQL Database
Multi-cloud Connection Options [30%]
Understand OCI multi-cloud connectivity options, such as Site-to-Site VPN and
FastConnect
Implement OCI-Azure Interconnect
Oracle Database Service for Azure [40%]
Explain the prerequisites and onboarding options for Oracle Database Service
for Azure
Implement Oracle Database Service for Azure
Configure common Oracle Database Service for Azure database management tasks
Sample Questions and Answers
QUESTION 1
What is the purpose of the SAML metadata file in the OCI Federation setup with
Azure Active Directory (AD)?
A. It is used to exchange metadata information between Azure AD and OCI.
B. It is used to configure attribute mapping between Azure AD and OCI.
C. It is used to establish trust between Azure AD and OCI.
D. It is used to store user credentials for authentication.
Answer: A
Explanation:
In general, SAML metadata is used to share configuration information between
the Identity Provider (IdP) and the Service Provider (SP).
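The kind of configuration a SAML metadata file exchanges can be seen by parsing
one with Python's standard library. The metadata below is a minimal, hand-made
example; the entity ID and endpoint URL are hypothetical placeholders, not real
Azure AD or OCI values.

```python
import xml.etree.ElementTree as ET

# Minimal, hypothetical IdP metadata document (hand-made for illustration).
METADATA = """<?xml version="1.0"?>
<md:EntityDescriptor xmlns:md="urn:oasis:names:tc:SAML:2.0:metadata"
                     entityID="https://sts.windows.net/example-tenant/">
  <md:IDPSSODescriptor
      protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">
    <md:SingleSignOnService
        Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect"
        Location="https://login.example.com/saml2"/>
  </md:IDPSSODescriptor>
</md:EntityDescriptor>
"""

NS = {"md": "urn:oasis:names:tc:SAML:2.0:metadata"}

root = ET.fromstring(METADATA)
entity_id = root.attrib["entityID"]            # identifies the IdP
sso = root.find(".//md:SingleSignOnService", NS)
sso_url = sso.attrib["Location"]               # where sign-on requests go
```

The SP imports exactly this kind of information (entity ID, endpoints, and, in
real metadata, signing certificates) instead of an administrator typing it in
by hand, which is what makes the exchange less error-prone.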
QUESTION 2
A company wants to seamlessly build a private interconnection between their
OCI and Microsoft
Azure environments with consistent performance and low latency. They want to
enable their cloud
engineers to set up Single Sign-On (SSO) between Microsoft Azure and OCI for
their Oracle applications,
such as PeopleSoft, JD Edwards EnterpriseOne, and E-Business Suite.
Which technology integration can the company use to achieve this goal?
A. Direct Connect and Azure VPN Gateway
B. OCI Site-to-Site VPN and Azure Site-to-Site VPN
C. Oracle FastConnect and Azure ExpressRoute
D. Cloud Interconnect and Virtual WAN
Answer: C
Explanation:
By using Oracle FastConnect and Azure ExpressRoute, customers can seamlessly
build a private
interconnection between their OCI and Microsoft Azure environments. The
Interconnect also enables
joint customers to take advantage of a unified identity and access management
platform that
leads to cost savings. Cloud engineers can set up SSO between Microsoft Azure
and OCI for their
Oracle applications, such as PeopleSoft, JD Edwards EnterpriseOne, and
E-Business Suite. Having a
federated SSO makes the integration seamless and allows users to authenticate
only once to access
multiple applications, without signing in separately to access each application.
QUESTION 3
Which components are required to establish a Site-to-Site VPN connection in
Oracle Cloud Infrastructure?
A. Internet Gateway, Customer Premises Equipment (CPE), and IPsec tunnel
B. Internet Gateway (IG), Network Address Translation (NAT) Gateway, and
IPsec tunnel
C. Dynamic Routing Gateway (DRG), Customer Premises Equipment (CPE), and IPsec
tunnel
D. Dynamic Routing Gateway (DRG), NAT Gateway, and IPsec tunnel
Answer: C
Explanation:
Site-to-Site VPN Components:
CPE OBJECT: At your end of Site-to-Site VPN is the actual device in your
on-premises network
(whether hardware or software). The term customer-premises equipment (CPE) is
commonly used in
some industries to refer to this type of on-premises equipment.
DYNAMIC ROUTING GATEWAY (DRG): At Oracle's end of Site-to-Site VPN is a virtual
router called a
dynamic routing gateway, which is the gateway into your VCN from your
on-premises network.
IPSEC CONNECTION: After creating the CPE object and DRG, you connect them by
creating an IPSec
connection, which you can think of as a parent object that represents the
Site-to-Site VPN.
TUNNEL: An IPSec tunnel is used to encrypt traffic between secure IPSec
endpoints. Oracle creates two tunnels in each IPSec connection for redundancy.
So, Internet Gateway and NAT Gateway are NOT valid Site-to-Site VPN
components. Hence, Dynamic Routing Gateway (DRG), Customer Premises Equipment
(CPE), and IPsec tunnel is the CORRECT answer.
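The component relationships described above (a CPE object and a DRG joined by
an IPSec connection that always carries two tunnels) can be sketched as a small
data model. This is purely illustrative; the resource names are invented, not
OCI identifiers.

```python
from dataclasses import dataclass, field

@dataclass
class IPSecConnection:
    """Parent object representing the Site-to-Site VPN as described above."""
    cpe: str   # customer-premises equipment object (your end)
    drg: str   # dynamic routing gateway (Oracle's end)
    # Oracle creates two tunnels per IPSec connection for redundancy.
    tunnels: list = field(default_factory=lambda: ["tunnel-1", "tunnel-2"])

# Hypothetical names for illustration only.
conn = IPSecConnection(cpe="onprem-cpe", drg="vcn-drg")
```

The parent/child shape mirrors the prose: you create the CPE object and DRG
first, then connect them with the IPSec connection, which owns the redundant
tunnels.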
QUESTION 4
What should you do to prepare your Oracle Cloud Infrastructure (OCI) Virtual
Cloud Network (VCN) for potential security risks when connected to a Microsoft
Azure VNet?
A. Allow all traffic from the Azure VNet without restrictions.
B. Limit all inbound and outbound traffic from the Azure VNet to expected and
well-defined traffic.
C. Remove all OCI security rules.
D. Disable the connection between Azure VNet and OCI VCN.
Answer: B
Explanation:
Controlling Traffic Flow Over the Connection
Even if a connection has been established between your VCN and VNet, you can
control the packet
flow over the connection with route tables in your VCN. For example, you can
restrict traffic to only specific subnets in the VNet.
Controlling the Specific Types of Traffic Allowed
It's important that you ensure that all outbound and inbound traffic with the
VNet is intended or
expected and well defined. Implement Azure network security group and Oracle
security rules that
explicitly state the types of traffic one cloud can send to the other and accept
from the other.
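The "expected and well-defined traffic" principle above amounts to explicit
allow rules with an implicit deny. The sketch below models that check in plain
Python; the CIDR ranges and ports are hypothetical examples, not values from
any real security rule set.

```python
import ipaddress

# Expected, well-defined cross-cloud flows: (source network, destination port).
# CIDRs and ports are hypothetical examples.
ALLOW_RULES = [
    (ipaddress.ip_network("10.1.0.0/24"), 1521),   # app subnet -> database
    (ipaddress.ip_network("10.1.1.0/24"), 443),    # web subnet -> API
]

def is_allowed(src_ip: str, dst_port: int) -> bool:
    """Permit traffic only if it matches an explicit rule; deny by default."""
    src = ipaddress.ip_address(src_ip)
    return any(src in net and dst_port == port for net, port in ALLOW_RULES)
```

Azure network security groups and OCI security rules apply the same logic:
anything not explicitly stated as an expected flow between the two clouds is
rejected.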
QUESTION 5
Which is a database service that CANNOT be provisioned in the Oracle Public
Cloud?
A. Autonomous Database on Dedicated Infrastructure
B. Exadata Database Service on Shared Infrastructure
C. Autonomous Database on Shared Infrastructure
D. Exadata Database Service on Dedicated Infrastructure
Answer: B
Explanation:
Exadata Database Service is offered only on dedicated infrastructure, so
Exadata Database Service on Shared Infrastructure is NOT a supported option in
the Oracle Public Cloud.
Student Reviews and Discussions
Packiam Vijendran 1 month ago - Malaysia
Passed the exam yesterday; 95% of the questions were from this site. Note: pay
more attention to the community discussions on each question instead of the
answers provided by examtopics, and I strongly suggest getting contributor
access.
upvoted 4 times
Javier Cardaba Enjuto 2 months, 1 week ago - Spain
Excellent pre-exam session tool
upvoted 2 times
Palanisamy Arulmohan 1 month, 1 week ago - USA
I passed today, 94 questions asked and 99% of them were in this dump.
3 labs: BGP (as-override), HSRP, OSPF (without network statement)
upvoted 4 times
peppinauz 3 months, 2 weeks ago
I passed my exam; the dump is about 90-95% valid. Review the community
answers!
upvoted 6 times
Oberoi Ankit 3 months, 3 weeks ago - USA Texas
Passed the exam today; the dump is still accurate. Almost all the questions
are here, though some are overcomplicated or incomplete on the site.
upvoted 4 times
AZ-140 Configuring and Operating Windows Virtual Desktop on Microsoft Azure
Exam Skills measured prior to October 26, 2023
Skills measured prior to October 26, 2023
Audience profile
Candidates for this exam are server or desktop administrators with subject
matter expertise in designing, implementing, managing, and maintaining Microsoft
Azure Virtual Desktop experiences and remote apps for any device.
To deliver these experiences, they work closely with Azure administrators, Azure
architects, Microsoft 365 administrators, and Azure security engineers.
Candidates for this exam should have experience with Azure technologies,
including virtualization, networking, identity, storage, and resiliency. They
should be able to manage end-user desktop environments, including delivering
applications and configuring user settings. These professionals use the Azure
portal, templates, scripting, and command-line tools to manage an Azure Virtual
Desktop deployment.
Skills at a glance
Plan and implement an Azure Virtual Desktop infrastructure (40–45%)
Plan and implement identity and security (15–20%)
Plan and implement user environments and apps (20–25%)
Monitor and maintain an Azure Virtual Desktop infrastructure (10–15%)
Plan and implement an Azure Virtual Desktop infrastructure (40–45%)
Plan, implement, and manage networking for Azure Virtual Desktop
Assess network capacity and speed requirements for Azure Virtual Desktop
Calculate and recommend a configuration for network requirements
Plan and implement Azure virtual network connectivity
Manage connectivity to the internet and on-premises networks
Plan and implement RDP Shortpath and quality of service (QoS) policies
Plan and implement name resolution for Azure Virtual Desktop
Monitor and troubleshoot network connectivity
Plan and implement storage for Azure Virtual Desktop user data
Plan storage for Azure Virtual Desktop user data
Implement storage for FSLogix components
Implement storage accounts
Implement file shares
Implement Azure NetApp Files
Plan host pools and session hosts
Recommend resource groups, subscriptions, and management groups
Recommend an operating system (OS) for an Azure Virtual Desktop implementation
Recommend an appropriate licensing model for Azure Virtual Desktop based on
requirements
Plan a host pools architecture
Calculate and recommend a configuration for performance requirements
Calculate and recommend a configuration for Azure Virtual Machines capacity
requirements
Implement host pools and session hosts
Create host pools and session hosts by using the Azure portal
Automate creation of Azure Virtual Desktop hosts and host pools by using
PowerShell, Azure CLI, Azure Resource Manager templates (ARM templates), and
Bicep
Configure host pool and session host settings
Apply a Windows client or Windows Server license to a session host
Create and manage session host images
Create a golden image manually
Create a golden image by using Azure VM Image Builder
Modify a session host image
Plan and implement lifecycle management for images
Apply OS and application updates to an image
Create a session host by using a golden image
Plan and implement image storage
Create and manage Azure Compute Gallery
Plan and implement identity and security (15–20%)
Plan and implement identity integration
Choose an identity management and authentication method
Identify Azure Virtual Desktop requirements for Active Directory Domain Services
(AD DS), Microsoft Entra Domain Services, and Microsoft Entra ID
Plan and implement Azure roles and role-based access control (RBAC) for Azure
Virtual Desktop
Plan and implement Microsoft Entra Conditional Access policies for connections
to Azure Virtual Desktop
Plan and implement multifactor authentication in Azure Virtual Desktop
Manage roles, groups, and rights assignments on Azure Virtual Desktop session
hosts
Plan and implement security
Plan, implement, and manage security for Azure Virtual Desktop session hosts by
using Microsoft Defender for Cloud
Configure Microsoft Defender Antivirus for session hosts
Implement and manage network security for connections to Azure Virtual Desktop
Configure Azure Bastion or just-in-time (JIT) for administrative access to
session hosts
Plan and implement Windows Threat Protection features on Azure Virtual Desktop
session hosts, including Windows Defender Application Control
Plan and implement user environments and apps (20–25%)
Plan and implement FSLogix
Recommend FSLogix configuration
Install and configure FSLogix
Configure Profile Containers
Configure Office Containers
Configure Cloud Cache
Plan and implement user experience and client settings
Choose an Azure Virtual Desktop client and deployment method
Deploy and troubleshoot Azure Virtual Desktop clients
Configure device redirection
Configure printing and Universal Print
Configure user settings through Group Policy and Microsoft Intune policies
Configure Remote Desktop Protocol (RDP) properties on a host pool
Configure session timeout properties
Implement the Start Virtual Machine on Connect feature
Assign and unassign personal desktops for users
Install and configure apps on a session host
Choose a method for deploying an app to Azure Virtual Desktop
Configure dynamic application delivery by using MSIX app attach
Publish an application as a RemoteApp
Implement FSLogix application masking
Implement and manage OneDrive, including multisession environments
Implement and manage Microsoft Teams, including AV redirect
Implement and manage Microsoft 365 apps on Azure Virtual Desktop session hosts
Implement and manage browsers for Azure Virtual Desktop sessions
Create and configure an application group
Assign users to application groups
Monitor and maintain an Azure Virtual Desktop infrastructure (10–15%)
Monitor and manage Azure Virtual Desktop services
Configure log collection and analysis for Azure Virtual Desktop session hosts
Configure Azure Virtual Desktop monitoring by using Azure Monitor
Monitor Azure Virtual Desktop by using Azure Monitor
Customize Azure Monitor workbooks for Azure Virtual Desktop monitoring
Monitor Azure Virtual Desktop by using Azure Advisor
Optimize session host capacity and performance
Implement autoscaling in host pools
Monitor and manage active sessions and application groups
Configure automation for Azure Virtual Desktop
Automate management of host pools, session hosts, and user sessions by using
PowerShell and Azure CLI
Plan, implement, and maintain business continuity
Recommend an update strategy for session hosts
Plan and implement a disaster recovery plan for Azure Virtual Desktop
Plan for multi-region implementation
Design a backup strategy for Azure Virtual Desktop
Configure backup and restore for FSLogix user profiles, personal virtual desktop
infrastructures (VDIs), and golden images
Sample Question and Answers
QUESTION 1 You plan to implement FSLogix profile containers for the Seattle office.
Which storage account should you use?
A. storage2
B. storage4
C. storage3
D. storage1
Answer: A
QUESTION 2 Which role should you assign to Operator2 to meet the technical
requirements?
A. Desktop Virtualization Session Host Operator
B. Desktop Virtualization Host Pool Contributor
C. Desktop Virtualization User Session Operator
D. Desktop Virtualization Contributor
Answer: D
QUESTION 3 You need to configure the device redirection settings. The solution must
meet the technical requirements.
Where should you configure the settings?
A. Workspace1
B. MontrealUsers
C. Group1
D. Pool1
Answer: D
QUESTION 4 You need to configure the virtual machines that have the Pool1 prefix. The
solution must meet the technical requirements.
What should you use?
A. Windows Virtual Desktop automation task
B. Virtual machine auto-shutdown
C. Service Health in Azure Monitor
D. Azure Automation
Answer: A
QUESTION 5 Which setting should you modify for VNET4 before you can deploy Pool4?
A. Service endpoints
B. Address space
C. DNS servers
D. Access control (IAM)
E. Peerings
Answer: C
QUESTION 6 Which three PowerShell modules should you install on Server1 to meet the
technical requirements?
Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Pester
B. RemoteDesktop
C. ServerManager
D. ActiveDirectory
E. Hyper-V
Answer: ADE
Students Reviews / Discussion
Mike 3 months, 3 weeks ago - Uganda
430/500. The 60 exam questions were all within the 100 questions of this dump.
Mohammed Deeia 4 months, 1 week ago - Kuwait
More than 90% of the questions are still valid, but review all of them; some
have wrong answers.
SHARON GRUBER 6 months, 1 week ago - United States
They're valid. 465/500 passing score. Thanks
Lembede 6 months, 2 weeks ago - South Africa
They're valid. 439/500 passing score. Thanks! DYOR
Sharan Telukunta 9 months, 3 weeks ago - New Jersey
Passed with 448/500. Do independent research on all questions.
Term moxa 10 months ago - Baltimore
Passed with 439/500. Do independent research on all questions.
Christoph Spirig 10 months, 1 week ago - Switzerland
Passed with 457/500. Dump is valid. Check the discussions and search the Internet.
Mama Brien 10 months, 1 week ago - Singapore
Passed with 450/500. All questions are in this dump. Check the discussions and
do your research.
700-760 SAAM Security Architecture for Account Managers
Duration: 90 minutes
Languages: English
Exam overview
This exam tests the knowledge of the Cisco Security portfolio that a registered
partner organization requires to obtain the Security specialization in the
account manager (AM) role.
Exam topics The following topics are general guidelines for the content likely to be
included on the exam. However, other related topics may also appear on any
specific delivery of the exam. In order to better reflect the contents of the
exam and for clarity purposes, the guidelines below may change at any time
without notice.
20% 1.0 Threat Landscape and Security Issues
1.1 Identify the role of digitization in cyber security
1.2 Identify cyber security challenges
1.3 Identify causes of fragmented security
1.4 Identify security opportunities and obstacles
15% 2.0 Selling Cisco Security
2.1 Identify how Cisco supports practice development
2.2 Identify areas of the Cisco security portfolio
2.3 Identify Cisco programs for partner support
2.4 Identify Cisco programs for partner profitability
15% 3.0 Customer Conversations
3.1 Identify Cisco portfolio components
3.2 Identify Cisco security solutions
3.3 Identify customer security challenges
3.4 Identify components of Cisco’s best-in-class technology
15% 4.0 IoT Security
4.1 Identify IoT solutions critical to business
4.2 Identify the evolution of and need for IoT security
4.3 Identify how Cisco IoT Security solutions provide layered protection
4.4 Identify components of Cisco’s IoT security
15% 5.0 Cisco Zero Trust
5.1 Identify the value of and drivers for trust-centric security
5.2 Identify the value of zero trust
5.3 Identify zero-trust solutions
5.4 Identify Cisco Zero Trust outcomes
20% 6.0 Cisco Security Solutions Portfolio
6.1 Identify the requirements of modern network environments
6.2 Identify the challenges of next generation networks
6.3 Identify Cisco solutions for next generation network security
6.4 Identify Cisco security solution sets
Exam preparation
Official Cisco training
Security Architecture for Account Managers
Cisco partners can access Sales Connect for a variety of training content to
help them prepare for this exam.
QUESTION 1
Which component of StealthWatch uses sophisticated security analytics to accelerate threat
response times?
A. Network control
B. Investigation
C. Anomaly detection
D. Threat protection
Answer: B
QUESTION 2
Which three products are Cisco Visibility & Enforcement solutions? (Choose three.)
A. Web Security
B. AnyConnect
C. TrustSec
D. Identity Services Engine (ISE)
E. Next-Generation Firewalls (NGFW)
F. Next-Generation Intrusion Prevention System (NGIPS)
G. Advanced Malware Protection (AMP) for Endpoints
Answer: B,C,D
QUESTION 3
Which three products are in Cisco's Web & Email Security portfolio? (Choose three.)
A. Meraki
B. ESA
C. Investigate
D. WSA
E. Umbrella
F. CES
Answer: B,D,F
QUESTION 4
What is used to reduce attack surfaces?
A. Access
B. Remediation
C. Segmentation
D. Device trust
Answer: C
QUESTION 5
Which two benefits of flexible purchasing does Cisco provide? (Choose two.)
A. Simplify the datacenter
B. Meet immediate needs
C. Roll out more solutions
D. Plan for the future
E. Reduce training needs