Download Microsoft.AI-100.PrepAway.2021-02-17.156q.vcex


File Info

Exam: Designing and Implementing an Azure AI Solution
Number: AI-100
File Name: Microsoft.AI-100.PrepAway.2021-02-17.156q.vcex
Size: 5 MB
Posted: Feb 17, 2021

How to open VCEX & EXAM Files?

Files with VCEX & EXAM extensions can be opened by ProfExam Simulator.


Demo Questions

Question 1

You are designing an application to parse images of business forms and upload the data to a database. 
The upload process will occur once a week. 
You need to recommend which services to use for the application. The solution must minimize infrastructure costs. 
Which services should you recommend? To answer, select the appropriate options in the answer area. 
NOTE: Each correct selection is worth one point.


Correct answer: To work with this question, an Exam Simulator is required.
Explanation:
Box 1: Azure Cognitive Services
Azure Cognitive Services include image-processing algorithms to smartly identify, caption, index, and moderate your pictures and videos. 
Not: Azure Linguistic Analytics API, which provides advanced natural language processing over raw text.
Box 2: Azure Data Factory
Azure Data Factory (ADF) is a service designed to allow developers to integrate disparate data sources. It is a platform somewhat like SSIS in the cloud for managing both on-premises and cloud data.
It provides access to on-premises data in SQL Server and cloud data in Azure Storage (Blob and Tables) and Azure SQL Database. 
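For Box 1, one way to extract text from a scanned form is the Computer Vision OCR REST endpoint. The sketch below is a minimal illustration only, not the question's prescribed implementation; the endpoint region, API version, and key are placeholders, and the weekly Data Factory upload is out of scope here.

```python
import requests

# Placeholder endpoint and key; substitute your own Cognitive Services resource.
ENDPOINT = "https://westus.api.cognitive.microsoft.com"
KEY = "<subscription-key>"

def ocr_form(image_path: str) -> list:
    """Send one scanned form image to the Computer Vision OCR API and
    return the recognized words, region by region."""
    with open(image_path, "rb") as f:
        resp = requests.post(
            f"{ENDPOINT}/vision/v2.0/ocr",
            headers={
                "Ocp-Apim-Subscription-Key": KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )
    resp.raise_for_status()
    # The OCR response nests words under regions -> lines -> words.
    words = []
    for region in resp.json().get("regions", []):
        for line in region.get("lines", []):
            words.extend(w["text"] for w in line.get("words", []))
    return words
```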
References:
https://azure.microsoft.com/en-us/services/cognitive-services/
https://www.jamesserra.com/archive/2014/11/what-is-azure-data-factory/



Question 2

You plan to deploy an Azure Data Factory pipeline that will perform the following:
  • Move data from on-premises to the cloud. 
  • Consume Azure Cognitive Services APIs. 
You need to recommend which technologies the pipeline should use. The solution must minimize custom code. 
What should you include in the recommendation? To answer, select the appropriate options in the answer area. 
NOTE: Each correct selection is worth one point.


Correct answer: To work with this question, an Exam Simulator is required.
Explanation:
Box 1: Self-hosted Integration Runtime
A self-hosted IR is capable of running copy activities between cloud data stores and a data store in a private network. 
Not Azure-SSIS Integration Runtime, as you would need to write custom code. 
Box 2: Azure Logic Apps
Azure Logic Apps helps you orchestrate and integrate different services by providing 100+ ready-to-use connectors, ranging from on-premises SQL Server or SAP to Microsoft Cognitive Services. 
Incorrect:
Not Azure API Management: Use Azure API Management as a turnkey solution for publishing APIs to external and internal customers. 
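For Box 1, a self-hosted integration runtime is usually registered through the portal, but it can also be provisioned from code. A minimal sketch using the azure-mgmt-datafactory Python SDK follows; the resource names are placeholders, and the on-premises node still has to be registered with the generated authentication key afterwards.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeResource,
    SelfHostedIntegrationRuntime,
)

# Placeholder subscription and resource names for illustration only.
client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

ir = client.integration_runtimes.create_or_update(
    resource_group_name="<resource-group>",
    factory_name="<data-factory>",
    integration_runtime_name="onprem-selfhosted-ir",
    integration_runtime=IntegrationRuntimeResource(
        properties=SelfHostedIntegrationRuntime(
            description="Moves on-premises data to the cloud"
        )
    ),
)
print(ir.name)
```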
References:
https://docs.microsoft.com/en-us/azure/data-factory/concepts-integration-runtime
https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-examples-and-scenarios



Question 3

You need to build an interactive website that will accept uploaded images, and then ask a series of predefined questions based on each image. 
Which services should you use? To answer, select the appropriate options in the answer area. 
NOTE: Each correct selection is worth one point.


Correct answer: To work with this question, an Exam Simulator is required.
Explanation:
Box 1: Azure Bot Service
Box 2: Computer Vision
The Computer Vision "Analyze an image" feature returns information about visual content found in an image. 
Use tagging, domain-specific models, and descriptions in four languages to identify content and label it with confidence. Use Object Detection to get the location of thousands of objects within an image. Apply the adult/racy settings to help you detect potential adult content. Identify image types and color schemes in pictures. 
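The Analyze feature described above is exposed through the azure-cognitiveservices-vision-computervision SDK. A minimal sketch, assuming a placeholder endpoint, key, and image URL:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

# Placeholder credentials; use your Computer Vision resource's values.
client = ComputerVisionClient(
    "https://<region>.api.cognitive.microsoft.com",
    CognitiveServicesCredentials("<subscription-key>"),
)

# Request tags, a caption, and adult-content flags in a single call.
analysis = client.analyze_image(
    "https://example.com/uploaded-image.jpg",
    visual_features=[
        VisualFeatureTypes.tags,
        VisualFeatureTypes.description,
        VisualFeatureTypes.adult,
    ],
)
print(analysis.description.captions[0].text)
print([tag.name for tag in analysis.tags])
```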
References:
https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/



Question 4

You are designing an AI solution that will analyze millions of pictures by using Azure HDInsight Hadoop cluster. 
You need to recommend a solution for storing the pictures. The solution must minimize costs. 
Which storage solution should you recommend?


  1. an Azure Data Lake Storage Gen1
  2. Azure File Storage
  3. Azure Blob storage
  4. Azure Table storage
Correct answer: C
Explanation:
Azure Data Lake Storage is a bit more expensive, although the two are priced in a similar range. Blob storage offers more pricing options, such as access tiers (cool vs. hot) based on how frequently you need to access your data. 
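To illustrate the tiering point, the azure-storage-blob SDK lets you pick an access tier per blob at upload time. A minimal sketch with placeholder connection details and file names:

```python
from azure.storage.blob import BlobClient, StandardBlobTier

# Placeholder connection string, container, and blob names.
blob = BlobClient.from_connection_string(
    conn_str="<storage-connection-string>",
    container_name="pictures",
    blob_name="img-000001.jpg",
)

# Infrequently accessed pictures can land in the Cool tier, which trades
# lower storage cost for higher access cost; use Hot for frequent reads.
with open("img-000001.jpg", "rb") as f:
    blob.upload_blob(f, standard_blob_tier=StandardBlobTier.Cool)
```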
Reference:
http://blog.pragmaticworks.com/azure-data-lake-vs-azure-blob-storage-in-data-warehousing



Question 5

You are configuring data persistence for a Microsoft Bot Framework application. The application requires a structured NoSQL cloud data store. 
You need to identify a storage solution for the application. The solution must minimize costs. 
What should you identify?


  1. Azure Blob storage
  2. Azure Cosmos DB
  3. Azure HDInsight
  4. Azure Table storage
Correct answer: D
Explanation:
Table storage is a NoSQL key-value store for rapid development using massive semi-structured datasets. You can develop applications on Cosmos DB using popular NoSQL APIs. 
The two services target different scenarios and pricing models. 
While Azure Table storage is aimed at high capacity in a single region (an optional secondary read-only region, but no failover), indexing by PartitionKey/RowKey, and storage-optimized pricing, the Azure Cosmos DB Table API aims for high throughput (single-digit-millisecond latency), global distribution (multiple failover regions), SLA-backed predictable performance with automatic indexing of every attribute/property, and a pricing model focused on throughput. 
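A quick sketch of the PartitionKey/RowKey model mentioned above, using the azure-data-tables SDK. The connection string, table, and entity values are placeholders; storing bot state this way is a common pattern, not something the question prescribes.

```python
from azure.data.tables import TableClient

# Placeholder connection string and table name.
table = TableClient.from_connection_string(
    conn_str="<storage-connection-string>", table_name="botstate"
)
table.create_table()  # raises if the table already exists

# Every entity is addressed (and indexed) by PartitionKey + RowKey.
table.create_entity(
    {
        "PartitionKey": "user-42",
        "RowKey": "conversation-001",
        "lastUtterance": "hello",
    }
)
entity = table.get_entity(partition_key="user-42", row_key="conversation-001")
print(entity["lastUtterance"])
```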
References:
https://db-engines.com/en/system/Microsoft+Azure+Cosmos+DB%3BMicrosoft+Azure+Table+Storage



Question 6

You have an Azure Machine Learning model that is deployed to a web service. 
You plan to publish the web service by using the name ml.contoso.com. 
You need to recommend a solution to ensure that access to the web service is encrypted. 
Which three actions should you recommend? Each correct answer presents part of the solution. 
NOTE: Each correct selection is worth one point.


  1. Generate a shared access signature (SAS)
  2. Obtain an SSL certificate
  3. Add a deployment slot
  4. Update the web service
  5. Update DNS
  6. Create an Azure Key Vault
Correct answer: BDE
Explanation:
The process of securing a new web service or an existing one is as follows:
  1. Get a domain name. 
  2. Get a digital certificate. 
  3. Deploy or update the web service with the SSL setting enabled. 
  4. Update your DNS to point to the web service. 
Note: To deploy (or re-deploy) the service with SSL enabled, set the ssl_enabled parameter to True, wherever applicable. Set the ssl_certificate parameter to the value of the certificate file and the ssl_key to the value of the key file. 
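In the v1 azureml-core SDK these settings map onto the deployment configuration. A minimal sketch, assuming an AKS deployment target and placeholder certificate and key files obtained for ml.contoso.com:

```python
from azureml.core.webservice import AksWebservice

# Placeholder PEM files from the certificate you obtained for the domain.
deploy_config = AksWebservice.deploy_configuration(
    ssl_enabled=True,
    ssl_cert_pem_file="cert.pem",
    ssl_key_pem_file="key.pem",
    ssl_cname="ml.contoso.com",
)
# Pass deploy_config to Model.deploy(...) to create or update the service,
# then point the DNS record for ml.contoso.com at the service's IP address.
```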
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-secure-web-service



Question 7

Your company recently deployed several hardware devices that contain sensors. 
The sensors generate new data on an hourly basis. The data generated is stored on-premises and retained for several years. 
During the past two months, the sensors generated 300 GB of data. 
You plan to move the data to Azure and then perform advanced analytics on the data. 
You need to recommend an Azure storage solution for the data. 
Which storage solution should you recommend?


  1. Azure Queue storage
  2. Azure Cosmos DB
  3. Azure Blob storage
  4. Azure SQL Database
Correct answer: C
Explanation:
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/data-storage



Question 8

You plan to design an application that will use data from Azure Data Lake and perform sentiment analysis by using Azure Machine Learning algorithms. 
The developers of the application use a mix of Windows- and Linux-based environments. The developers contribute to shared GitHub repositories. 
You need all the developers to use the same tool to develop the application. 
What is the best tool to use? More than one answer choice may achieve the goal. Select the BEST answer.


  1. Microsoft Visual Studio Code
  2. Azure Notebooks
  3. Azure Machine Learning Studio
  4. Microsoft Visual Studio
Correct answer: C
Explanation:
References:
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/machine-learning/studio/algorithm-choice.md



Question 9

You have several AI applications that use an Azure Kubernetes Service (AKS) cluster. The cluster supports a maximum of 32 nodes. 
You discover that occasionally and unpredictably, the application requires more than 32 nodes. 
You need to recommend a solution to handle the unpredictable application load. 
Which scaling methods should you recommend? (Choose two.)


  1. horizontal pod autoscaler
  2. cluster autoscaler
  3. AKS cluster virtual 32 node autoscaling
  4. Azure Container Instances
Correct answer: AB
Explanation:
B: To keep up with application demands in Azure Kubernetes Service (AKS), you may need to adjust the number of nodes that run your workloads. The cluster autoscaler component can watch for pods in your cluster that can't be scheduled because of resource constraints. When issues are detected, the number of nodes is increased to meet the application demand. Nodes are also regularly checked for a lack of running pods, with the number of nodes then decreased as needed. This ability to automatically scale up or down the number of nodes in your AKS cluster lets you run an efficient, cost-effective cluster. 
A: You can also use the horizontal pod autoscaler to automatically adjust the number of pods that run your application. 
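The cluster autoscaler (B) is enabled on the AKS resource itself, for example with az aks update --enable-cluster-autoscaler. For the horizontal pod autoscaler (A), the sketch below uses the official kubernetes Python client; the deployment name, replica bounds, and CPU threshold are placeholders, not values from the question.

```python
from kubernetes import client, config

config.load_kube_config()  # assumes kubectl is already wired to the AKS cluster

hpa = client.V1HorizontalPodAutoscaler(
    metadata=client.V1ObjectMeta(name="inference-hpa"),
    spec=client.V1HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V1CrossVersionObjectReference(
            api_version="apps/v1", kind="Deployment", name="inference-app"
        ),
        min_replicas=3,
        max_replicas=100,  # the cluster autoscaler adds nodes when pods don't fit
        target_cpu_utilization_percentage=70,
    ),
)
client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="default", body=hpa
)
```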
Reference:
https://docs.microsoft.com/en-us/azure/aks/cluster-autoscaler



Question 10

You deploy an infrastructure for a big data workload. 
You need to run Azure HDInsight and Microsoft Machine Learning Server. You plan to set the RevoScaleR compute contexts to run rx function calls in parallel. 
What are three compute contexts that you can use for Machine Learning Server? Each correct answer presents a complete solution. 
NOTE: Each correct selection is worth one point.


  1. SQL
  2. Spark
  3. local parallel
  4. HBase
  5. local sequential
Correct answer: ABC
Explanation:
Remote computing is available for specific data sources on selected platforms. The following combinations are supported; a short code sketch follows the list. 
  • RxInSqlServer, sqlserver: Remote compute context. The target server is a single database node (SQL Server 2016 R Services or SQL Server 2017 Machine Learning Services). Computation is parallel, but not distributed. 
  • RxSpark, spark: Remote compute context. Target is a Spark cluster on Hadoop.
  • RxLocalParallel, localpar: Compute context is often used to enable controlled, distributed computations relying on instructions you provide rather than a built-in scheduler on Hadoop. You can use compute context for manual distributed computing. 
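Machine Learning Server also ships revoscalepy, the Python counterpart of RevoScaleR. A minimal sketch of switching from the default local sequential context to local parallel; the function and argument names assume revoscalepy and are worth verifying against your installed version.

```python
from revoscalepy import RxLocalParallel, rx_set_compute_context, rx_exec

def score_batch(batch_id):
    # Placeholder workload standing in for a real rx* function call.
    return batch_id * batch_id

# Local parallel lets rx_exec fan the calls out across the machine's cores
# instead of running them one after another.
rx_set_compute_context(RxLocalParallel())
results = rx_exec(score_batch, elem_args=list(range(8)))
print(results)
```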
References:
https://docs.microsoft.com/en-us/machine-learning-server/r/concept-what-is-compute-context








