Download Microsoft.AI-100.Prep4Sure.2019-09-27.56q.vcex


File Info

Exam: Designing and Implementing an Azure AI Solution
Number: AI-100
File Name: Microsoft.AI-100.Prep4Sure.2019-09-27.56q.vcex
Size: 973 KB
Posted: Sep 27, 2019

How to open VCEX & EXAM Files?

Files with VCEX & EXAM extensions can be opened by ProfExam Simulator.

Demo Questions

Question 1

You have an Azure Machine Learning model that is deployed to a web service. 
You plan to publish the web service by using the name ml.contoso.com. 
You need to recommend a solution to ensure that access to the web service is encrypted. 
Which three actions should you recommend? Each correct answer presents part of the solution. 
NOTE: Each correct selection is worth one point.


  A. Generate a shared access signature (SAS)
  B. Obtain an SSL certificate
  C. Add a deployment slot
  D. Update the web service
  E. Update DNS
  F. Create an Azure Key Vault
Correct answer: BDE
Explanation:
The process of securing a new web service or an existing one is as follows:
  1. Get a domain name. 
  2. Get a digital certificate. 
  3. Deploy or update the web service with the SSL setting enabled. 
  4. Update your DNS to point to the web service. 
Note: To deploy (or re-deploy) the service with SSL enabled, set the ssl_enabled parameter to True, wherever applicable. Set the ssl_certificate parameter to the value of the certificate file and the ssl_key to the value of the key file. 
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-secure-web-service
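
The SSL settings named in the note can be sketched as a plain settings dictionary. This is a minimal illustration only; the parameter names follow the explanation above, and the actual azureml SDK deployment call (and certificate file contents) are assumed rather than shown:

```python
# Sketch of the SSL settings described above, as a plain dictionary.
# The names (ssl_enabled, ssl_certificate, ssl_key) follow the explanation;
# the real deployment call and the certificate/key files are assumed.
ssl_settings = {
    "ssl_enabled": True,             # deploy or update the service with SSL on
    "ssl_certificate": "cert.pem",   # value of the certificate file
    "ssl_key": "key.pem",            # value of the key file
    "dns_name": "ml.contoso.com",    # DNS must point at the web service
}

def is_encrypted(settings):
    """Access is encrypted only when every SSL setting is present."""
    required = ("ssl_enabled", "ssl_certificate", "ssl_key", "dns_name")
    return all(settings.get(k) for k in required)

print(is_encrypted(ssl_settings))  # True
```

Leaving out any one of the three recommended actions (certificate, service update, DNS update) leaves the check failing, which mirrors why all three answers are required.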



Question 2

Your company recently deployed several hardware devices that contain sensors. 
The sensors generate new data on an hourly basis. The data generated is stored on-premises and retained for several years. 
During the past two months, the sensors generated 300 GB of data. 
You plan to move the data to Azure and then perform advanced analytics on the data. 
You need to recommend an Azure storage solution for the data. 
Which storage solution should you recommend?


  A. Azure Queue storage
  B. Azure Cosmos DB
  C. Azure Blob storage
  D. Azure SQL Database
Correct answer: C
Explanation:
Blob storage is optimized for storing massive amounts of unstructured data, such as raw sensor output, and is the standard landing zone in Azure for data that will later feed advanced analytics.
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/data-storage



Question 3

You plan to design an application that will use data from Azure Data Lake and perform sentiment analysis by using Azure Machine Learning algorithms. 
The developers of the application use a mix of Windows- and Linux-based environments. The developers contribute to shared GitHub repositories. 
You need all the developers to use the same tool to develop the application. 
What is the best tool to use? More than one answer choice may achieve the goal. Select the BEST answer.


  A. Microsoft Visual Studio Code
  B. Azure Notebooks
  C. Azure Machine Learning Studio
  D. Microsoft Visual Studio
Correct answer: C
Explanation:
Azure Machine Learning Studio is a collaborative, browser-based tool, so it works identically for developers on Windows- and Linux-based environments.
References:
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/machine-learning/studio/algorithm-choice.md



Question 4

You have several AI applications that use an Azure Kubernetes Service (AKS) cluster. The cluster supports a maximum of 32 nodes. 
You discover that occasionally and unpredictably, the application requires more than 32 nodes. 
You need to recommend a solution to handle the unpredictable application load. 
Which scaling method should you recommend?


  A. horizontal pod autoscaler
  B. cluster autoscaler
  C. manual scaling
  D. Azure Container Instances
Correct answer: B
Explanation:
To keep up with application demands in Azure Kubernetes Service (AKS), you may need to adjust the number of nodes that run your workloads. The cluster autoscaler component can watch for pods in your cluster that can't be scheduled because of resource constraints. When issues are detected, the number of nodes is increased to meet the application demand. Nodes are also regularly checked for a lack of running pods, with the number of nodes then decreased as needed. 
This ability to automatically scale up or down the number of nodes in your AKS cluster lets you run an efficient, cost-effective cluster. 
References:
https://docs.microsoft.com/en-us/azure/aks/cluster-autoscaler
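
The scale-up/scale-down behavior described above can be sketched as a toy decision function. This is a conceptual illustration in plain Python, not the real autoscaler, and the node bounds are made up for the example:

```python
# Conceptual sketch of the cluster autoscaler's decision loop (illustrative,
# not the actual AKS implementation). Pods that cannot be scheduled trigger a
# scale-up; idle nodes trigger a scale-down; both respect configured bounds.
def scale_decision(nodes, unschedulable_pods, idle_nodes, min_nodes=3, max_nodes=64):
    if unschedulable_pods > 0 and nodes < max_nodes:
        return nodes + 1          # add a node to meet application demand
    if idle_nodes > 0 and nodes > min_nodes:
        return nodes - 1          # remove an unneeded node to save cost
    return nodes                  # demand is balanced; keep the current size

print(scale_decision(nodes=32, unschedulable_pods=5, idle_nodes=0))  # 33
print(scale_decision(nodes=10, unschedulable_pods=0, idle_nodes=2))  # 9
```

Setting the maximum above 32 is what lets the cluster absorb the unpredictable bursts described in the question without manual intervention.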



Question 5

You deploy an infrastructure for a big data workload. 
You need to run Azure HDInsight and Microsoft Machine Learning Server. You plan to set the RevoScaleR compute contexts to run rx function calls in parallel. 
What are three compute contexts that you can use for Machine Learning Server? Each correct answer presents a complete solution. 
NOTE: Each correct selection is worth one point.


  A. SQL
  B. Spark
  C. local parallel
  D. HBase
  E. local sequential
Correct answer: ABC
Explanation:
Remote computing is available for specific data sources on selected platforms. The following tables document the supported combinations. 
  • RxInSqlServer, sqlserver: Remote compute context. Target server is a single database node (SQL Server 2016 R Services or SQL Server 2017 Machine Learning Services). Computation is parallel, but not distributed. 
  • RxSpark, spark: Remote compute context. Target is a Spark cluster on Hadoop.
  • RxLocalParallel, localpar: Compute context is often used to enable controlled, distributed computations relying on instructions you provide rather than a built-in scheduler on Hadoop. You can use compute context for manual distributed computing. 
References:
https://docs.microsoft.com/en-us/machine-learning-server/r/concept-what-is-compute-context



Question 6

Your company has 1,000 AI developers who are responsible for provisioning environments in Azure. 
You need to control the type, size, and location of the resources that the developers can provision. 
What should you use?


  A. Azure Key Vault
  B. Azure service principals
  C. Azure managed identities
  D. Azure Security Center
  E. Azure Policy
Correct answer: B
Explanation:
When an application needs access to deploy or configure resources through Azure Resource Manager in Azure Stack, you create a service principal, which is a credential for your application. You can then delegate only the necessary permissions to that service principal. 
References:
https://docs.microsoft.com/en-us/azure/azure-stack/azure-stack-create-service-principals
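
"Delegate only the necessary permissions" can be illustrated with a toy scope check. This is plain Python, not the Azure RBAC API; the role names and scope strings are made up for the example:

```python
# Toy model of delegating a narrow permission set to a service principal.
# Role names and scope paths are illustrative, not real Azure definitions.
ROLE_DEFINITIONS = {
    "Reader": {"read"},
    "Contributor": {"read", "write", "delete"},
}

def is_authorized(assignments, scope, action):
    """A principal may act only where a role granting that action is assigned."""
    return any(
        scope.startswith(assigned_scope) and action in ROLE_DEFINITIONS[role]
        for assigned_scope, role in assignments
    )

# A service principal limited to a single resource group:
sp_assignments = [("/subscriptions/s1/resourceGroups/dev", "Contributor")]

print(is_authorized(sp_assignments, "/subscriptions/s1/resourceGroups/dev/vm1", "write"))   # True
print(is_authorized(sp_assignments, "/subscriptions/s1/resourceGroups/prod/vm1", "write"))  # False
```

The principal can provision inside its assigned scope but nowhere else, which is the control the question is after.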



Question 7

You have a solution that runs on a five-node Azure Kubernetes Service (AKS) cluster. The cluster uses an N-series virtual machine. 
An Azure Batch AI process runs once a day and rarely on demand. 
You need to recommend a solution to maintain the cluster configuration when the cluster is not in use. The solution must not incur any compute costs. 
What should you include in the recommendation?


  A. Downscale the cluster to one node
  B. Downscale the cluster to zero nodes
  C. Delete the cluster
Correct answer: A
Explanation:
An AKS cluster must always have at least one node; the node count cannot be reduced to zero, and deleting the cluster would discard its configuration. Downscaling to one node therefore preserves the configuration at minimal cost.
References:
https://docs.microsoft.com/en-us/azure/aks/concepts-clusters-workloads



Question 8

Your company has recently deployed 5,000 Internet-connected sensors for a planned AI solution. 
You need to recommend a computing solution to perform a real-time analysis of the data generated by the sensors. 
Which computing solution should you recommend?


  A. an Azure HDInsight Storm cluster
  B. Azure Notification Hubs
  C. an Azure HDInsight Hadoop cluster
  D. an Azure HDInsight R cluster
Correct answer: C
Explanation:
Azure HDInsight makes it easy, fast, and cost-effective to process massive amounts of data. 
You can use HDInsight to process streaming data that's received in real time from a variety of devices. 
References:
https://docs.microsoft.com/en-us/azure/hdinsight/hadoop/apache-hadoop-introduction
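
The kind of real-time computation a streaming cluster performs on sensor data can be sketched with a toy tumbling-window aggregation. This is pure Python for illustration, not HDInsight itself:

```python
# Toy tumbling-window aggregation over sensor readings, illustrating the
# real-time analysis a streaming cluster performs (plain Python, not HDInsight).
from collections import defaultdict

def tumbling_window_avg(readings, window_seconds=60):
    """Average sensor values per fixed window; readings are (timestamp, value)."""
    windows = defaultdict(list)
    for ts, value in readings:
        windows[ts // window_seconds].append(value)
    return {w: sum(vs) / len(vs) for w, vs in windows.items()}

readings = [(0, 2.0), (30, 4.0), (65, 10.0)]
print(tumbling_window_avg(readings))  # {0: 3.0, 1: 10.0}
```

Each window closes as soon as its interval elapses, so results are available continuously rather than after a nightly batch run.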



Question 9

You deploy an application that performs sentiment analysis on the data stored in Azure Cosmos DB. 
Recently, you loaded a large amount of data to the database. The data was for a customer named Contoso, Ltd. 
You discover that queries for the Contoso data are slow to complete, and the queries slow the entire application. 
You need to reduce the amount of time it takes for the queries to complete. The solution must minimize costs. 
What is the best way to achieve the goal? More than one answer choice may achieve the goal. Select the BEST answer.


  A. Change the request units.
  B. Change the partitioning strategy.
  C. Change the transaction isolation level.
  D. Migrate the data to the Cosmos DB database.
Correct answer: B
Explanation:
Choosing a partition key that distributes the Contoso data evenly across partitions eliminates the hot partition that is slowing queries, without the added cost of provisioning more request units.
References:
https://docs.microsoft.com/en-us/azure/architecture/best-practices/data-partitioning
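
The hot-partition effect can be shown with a toy hash-partitioning demo. The `partition_of` function is a stand-in for Cosmos DB's hash-based partitioning, and the key names are made up:

```python
# Toy illustration of partition-key choice: a low-cardinality key funnels an
# entire customer's data into one partition, while a higher-cardinality key
# spreads the same load. crc32 stands in for Cosmos DB's hash partitioning.
import zlib
from collections import Counter

def partition_of(key, partitions=4):
    return zlib.crc32(key.encode()) % partitions

docs = [("contoso", i) for i in range(1000)]

# Partitioning by customer alone: every Contoso document shares one key.
by_customer = Counter(partition_of(customer) for customer, _ in docs)

# A synthetic composite key (customer + item id) spreads the load.
by_customer_item = Counter(partition_of(f"{customer}-{i}") for customer, i in docs)

print(len(by_customer), len(by_customer_item))
```

With a single key value, all 1,000 documents land in one partition, so every Contoso query competes for that partition's throughput; the composite key uses all of them.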



Question 10

You have an AI application that uses keys in Azure Key Vault. 
Recently, a key used by the application was deleted accidentally and was unrecoverable. 
You need to ensure that if a key is deleted, it is retained in the key vault for 90 days. 
Which two features should you configure? Each correct answer presents part of the solution. 
NOTE: Each correct selection is worth one point.


  A. The expiration date on the keys
  B. Soft delete
  C. Purge protection
  D. Auditors
  E. The activation date on the keys
Correct answer: BC
Explanation:
Soft delete retains deleted keys in the key vault for a retention period (90 days by default), during which they can be recovered. Purge protection prevents a soft-deleted key from being permanently purged before the retention period expires.
References:
https://docs.microsoft.com/en-us/azure/key-vault/general/soft-delete-overview
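
The interplay of the two features can be modeled with a toy vault class. This is an illustrative sketch in plain Python, not the Azure Key Vault SDK:

```python
# Toy model of the two Key Vault protections discussed above (illustrative,
# not the Azure SDK): soft delete retains deleted keys for a retention
# window, and purge protection blocks permanent deletion during that window.
import datetime as dt

class Vault:
    def __init__(self, purge_protection=True, retention_days=90):
        self.purge_protection = purge_protection
        self.retention_days = retention_days
        self.keys, self.deleted = {}, {}

    def delete(self, name):
        # Soft delete: the key moves to a recoverable state with an expiry.
        expiry = dt.date.today() + dt.timedelta(days=self.retention_days)
        self.deleted[name] = (self.keys.pop(name), expiry)

    def recover(self, name):
        key, _ = self.deleted.pop(name)
        self.keys[name] = key

    def purge(self, name):
        if self.purge_protection:
            raise PermissionError("purge protection: key retained until retention expires")
        self.deleted.pop(name)

v = Vault()
v.keys["app-key"] = "secret"
v.delete("app-key")
v.recover("app-key")          # recoverable within the 90-day window
print("app-key" in v.keys)    # True
```

Soft delete alone still allows an explicit purge; enabling purge protection as well is what guarantees the 90-day retention the question requires, which is why both answers are needed.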








