Download Oracle.1z0-449.BrainDumps.2018-11-29.42q.tqb


File Info

Exam: Oracle Big Data 2016 Implementation Essentials
Number: 1z0-449
File Name: Oracle.1z0-449.BrainDumps.2018-11-29.42q.tqb
Size: 680 KB
Posted: Nov 29, 2018

How to open VCEX & EXAM Files?

Files with VCEX & EXAM extensions can be opened by ProfExam Simulator.

Purchase

Coupon: MASTEREXAM
With discount: 20%






Demo Questions

Question 1

The hdfs_stream script is used by the Oracle SQL Connector for HDFS to perform a specific task to access data. 
What is the purpose of this script? 


  1. It is the preprocessor script for the Impala table.
  2. It is the preprocessor script for the HDFS external table.
  3. It is the streaming script that creates a database directory.
  4. It is the preprocessor script for the Oracle partitioned table.
  5. It defines the jar file that points to the directory where Hive is installed.
Correct answer: B
Explanation:
The hdfs_stream script is the preprocessor for the Oracle Database external table created by Oracle SQL Connector for HDFS. 
References: https://docs.oracle.com/cd/E37231_01/doc.20/e36961/start.htm#BDCUG107
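Conceptually, an external-table preprocessor is a program the database runs instead of reading the location file directly: it receives the location file as input and streams the actual data to standard output for the external table's access driver to consume. A toy Python sketch of that contract (this is an illustration only, not Oracle's actual hdfs_stream, which delegates to Java classes that read from HDFS):

```python
import sys

def preprocess(location_file: str) -> str:
    """Toy preprocessor: read a 'location file' that lists one data
    path per line, and return the concatenated contents of those
    paths -- the role hdfs_stream plays for HDFS-resident data."""
    out = []
    with open(location_file) as loc:
        for path in loc:
            path = path.strip()
            if not path:
                continue
            with open(path) as data:
                out.append(data.read())
    return "".join(out)

if __name__ == "__main__":
    # The database invokes the preprocessor with the location file as
    # an argument and consumes whatever it writes to stdout.
    sys.stdout.write(preprocess(sys.argv[1]))
```

This is why answer B is correct: the script's job is tied to the HDFS external table, not to Impala, partitioned tables, or Hive jar configuration.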



Question 2

How should you encrypt the Hadoop data that sits on disk?


  1. Enable Transparent Data Encryption by using the Mammoth utility.
  2. Enable HDFS Transparent Encryption by using bdacli on a Kerberos-secured cluster.
  3. Enable HDFS Transparent Encryption on a non-Kerberos secured cluster.
  4. Enable Audit Vault and Database Firewall for Hadoop by using the Mammoth utility.
Correct answer: B
Explanation:
HDFS Transparent Encryption protects Hadoop data that’s at rest on disk. When encryption is enabled for a cluster, data write and read operations on encrypted zones (HDFS directories) on the disk are automatically encrypted and decrypted. This process is “transparent” because it’s invisible to the application working with the data. 
The cluster where you want to use HDFS Transparent Encryption must have Kerberos enabled. 
Incorrect Answers:
C: The cluster where you want to use HDFS Transparent Encryption must have Kerberos enabled, so it cannot be enabled on a non-Kerberos secured cluster.
References: https://docs.oracle.com/en/cloud/paas/big-data-cloud/csbdi/using-hdfs-transparent-encryption.html#GUID-16649C5A-2C88-4E75-809A-BBF8DE250EA3
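The "transparent" behavior can be illustrated with a toy Python model of an encryption zone: callers write and read plaintext, while the bytes at rest are ciphertext. The XOR keystream below is a deliberately simplified stand-in; real HDFS encryption zones use AES with per-file data encryption keys managed by the Hadoop KMS, and none of these class or method names come from the actual HDFS API:

```python
import hashlib

def _keystream(key: bytes, length: int) -> bytes:
    """Derive a repeatable pseudo-keystream from the zone key
    (toy only; real encryption zones use AES per-file keys)."""
    stream = b""
    counter = 0
    while len(stream) < length:
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return stream[:length]

class EncryptionZone:
    """Toy model of an HDFS encryption zone: writes are encrypted and
    reads decrypted automatically, so the application only ever sees
    plaintext -- the 'transparent' part."""

    def __init__(self, zone_key: bytes):
        self._key = zone_key
        self._store = {}  # path -> ciphertext at rest

    def write(self, path: str, plaintext: bytes) -> None:
        ks = _keystream(self._key, len(plaintext))
        self._store[path] = bytes(a ^ b for a, b in zip(plaintext, ks))

    def read(self, path: str) -> bytes:
        ciphertext = self._store[path]
        ks = _keystream(self._key, len(ciphertext))
        return bytes(a ^ b for a, b in zip(ciphertext, ks))

    def at_rest(self, path: str) -> bytes:
        """What actually sits on disk: ciphertext, not plaintext."""
        return self._store[path]
```

A reader of the zone gets the original bytes back, while anyone inspecting the raw storage sees only ciphertext, which is exactly the at-rest protection the question asks about.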



Question 3

What two things does Big Data SQL push down to the storage cell on the Big Data Appliance? (Choose two.)


  1. Transparent Data Encrypted data
  2. the column selection of data from individual Hadoop nodes
  3. WHERE clause evaluations
  4. PL/SQL evaluation
  5. Business Intelligence queries from connected Exalytics servers
Correct answer: BC
Explanation:
Big Data SQL Smart Scan processes data locally on the Big Data Appliance storage servers, pushing down column projection (reading only the columns the query references) and WHERE clause (predicate) filtering, so that only the filtered, projected rows are returned to the database tier.
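The effect of this pushdown can be sketched in a few lines of Python: the "storage cell" applies the predicate and the column projection locally, so only matching rows and referenced columns travel back to the database. This is a conceptual model only; the function name and row representation here are invented for illustration:

```python
# Toy model of Smart Scan pushdown: the storage cell evaluates the
# WHERE predicate and projects columns before shipping results back.

def storage_cell_scan(rows, columns, predicate):
    """rows: list of dicts (the data on this cell);
    columns: column names to project;
    predicate: row -> bool, the pushed-down WHERE clause."""
    return [
        {c: row[c] for c in columns}   # column selection (projection)
        for row in rows
        if predicate(row)              # WHERE clause evaluation
    ]
```

Shipping only the projected, filtered rows is what saves network and database CPU; the alternative would be moving every row and column off the Hadoop nodes and filtering in the database.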








