DP-300 Exam Free Questions: Microsoft Administering Relational Databases on Microsoft Azure Certification

You have an Azure SQL managed instance named Server1 and an Azure Blob Storage account named storage1 that contains Microsoft SQL Server database backup files.
You plan to use Log Replay Service to migrate the backup files from storage1 to Server1. The solution must use the highest level of security when connecting to storage1.
Which PowerShell cmdlet should you run, and which parameter should you specify to secure the connection?
To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Correct answer:

Explanation:
You have an Azure subscription that uses a domain named contoso.com.
You have two Azure VMs named DBServer1 and DBServer2. Each of them hosts a default SQL Server instance. DBServer1 is in the East US Azure region and contains a database named DatabaseA. DBServer2 is in the West US Azure region.
DBServer1 has a high volume of data changes and low latency requirements for data writes.
You need to configure a new availability group for DatabaseA. The secondary replica will reside on DBServer2.
What should you do?

Explanation: (Visible only to GoShiken members)
You need to implement authentication for ResearchDB1. The solution must meet the security and compliance requirements.
What should you run as part of the implementation?

Explanation: (Visible only to GoShiken members)
You have an Azure subscription that contains an Azure SQL managed instance, a database named db1, and an Azure web app named App1. App1 uses db1.
You need to enable Resource Governor for App1. The solution must meet the following requirements:
App1 must be able to consume all available CPU resources.
App1 must have at least half of the available CPU resources always available.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
NOTE: More than one order of answer choices is correct. You will receive credit for any of the correct orders you select.
Correct answer:

Explanation:
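The answer area itself is not reproduced here. As a rough, unofficial illustration of how the two CPU requirements map onto Resource Governor settings, MAX_CPU_PERCENT = 100 lets App1 consume all available CPU, while MIN_CPU_PERCENT = 50 keeps at least half of the CPU reserved for it. The minimal T-SQL sketch below assumes hypothetical names (App1Pool, App1Group, fn_App1Classifier) and classifies sessions by APP_NAME():
-- Sketch only: the pool, group, and function names plus the APP_NAME() test are illustrative assumptions.
CREATE RESOURCE POOL App1Pool WITH (MIN_CPU_PERCENT = 50, MAX_CPU_PERCENT = 100);
CREATE WORKLOAD GROUP App1Group USING App1Pool;
GO
-- Classifier function, created in master, that routes App1 sessions to the App1 workload group.
CREATE FUNCTION dbo.fn_App1Classifier() RETURNS sysname
WITH SCHEMABINDING
AS
BEGIN
    IF APP_NAME() = 'App1' RETURN 'App1Group';
    RETURN 'default';
END;
GO
ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fn_App1Classifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;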
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Data Lake Storage account that contains a staging zone.
You need to design a daily process to ingest incremental data from the staging zone, transform the data by executing an R script, and then insert the transformed data into a data warehouse in Azure Synapse Analytics.
Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that executes an Azure Databricks notebook, and then inserts the data into the data warehouse.
Does this meet the goal?

You need to implement the monitoring of SalesSQLDb1. The solution must meet the technical requirements.
How should you collect and stream metrics? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Correct answer:

Explanation:

Box 1: The server, the elastic pool, and the database
Scenario:
SalesSQLDb1 is in an elastic pool named SalesSQLDb1Pool.
Litware technical requirements include: all SQL Server and Azure SQL Database metrics related to CPU and storage usage and limits must be analyzed by using Azure built-in functionality.
Box 2: Azure Event Hubs
Scenario: Migrate ManufacturingSQLDb1 to the Azure virtual machine platform.
Event Hubs can handle custom metrics.
Topic 3, ADatum Corporation
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.
To start the case study
To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.
Overview
ADatum Corporation is a retailer that sells products through two sales channels: retail stores and a website.
Existing Environment
ADatum has one database server that has Microsoft SQL Server 2016 installed. The server hosts three mission-critical databases named SALESDB, DOCDB, and REPORTINGDB.
SALESDB collects data from the stores and the website.
DOCDB stores documents that connect to the sales data in SALESDB. The documents are stored in two different JSON formats based on the sales channel.
REPORTINGDB stores reporting data and contains several columnstore indexes. A daily process creates reporting data in REPORTINGDB from the data in SALESDB. The process is implemented as a SQL Server Integration Services (SSIS) package that runs a stored procedure from SALESDB.
Requirements
Planned Changes
ADatum plans to move the current data infrastructure to Azure. The new infrastructure has the following requirements:
* Migrate SALESDB and REPORTINGDB to an Azure SQL database.
* Migrate DOCDB to Azure Cosmos DB.
* The sales data, including the documents in JSON format, must be gathered as it arrives and analyzed online by using Azure Stream Analytics. The analytics process will perform aggregations that must be done continuously, without gaps, and without overlapping.
* As they arrive, all the sales documents in JSON format must be transformed into one consistent format.
* Azure Data Factory will replace the SSIS process of copying the data from SALESDB to REPORTINGDB.
Technical Requirements
The new Azure data infrastructure must meet the following technical requirements:
* Data in SALESDB must be encrypted by using Transparent Data Encryption (TDE). The encryption must use your own key.
* SALESDB must be restorable to any given minute within the past three weeks.
* Real-time processing must be monitored to ensure that workloads are sized properly based on actual usage patterns.
* Missing indexes must be created automatically for REPORTINGDB.
* Disk IO, CPU, and memory usage must be monitored for SALESDB.
Task 10
You need to protect all the databases on sql37006S95 from SQL injection attacks.
Correct answer:
See the explanation part for the complete solution.
Explanation:
SQL injection attacks are a type of cyberattack that exploits a vulnerability in the application code that interacts with the database. An attacker can inject malicious SQL statements into the user input, such as a form field or a URL parameter, and execute them on the database server, resulting in data theft, corruption, or unauthorized access.
To protect all the databases on sql37006S95 from SQL injection attacks, you need to follow some best practices for securing your application and database layers. Here are some of the recommended steps:
* Use parameterized queries or stored procedures to separate the SQL code from the user input. This will prevent the user input from being interpreted as part of the SQL statement and avoid SQL injection (see the parameterized-query sketch after these steps).
* Validate and sanitize the user input before passing it to the database. This will ensure that the input conforms to the expected format and type, and remove any potentially harmful characters or keywords.
* Implement least-privilege access for the database users and roles. This will limit the permissions and actions that the application can perform on the database, and reduce the impact of a successful SQL injection attack.
* Enable Advanced Threat Protection for Azure SQL Database. This is a feature that detects and alerts you of anomalous activities and potential threats on your database, such as SQL injection, brute force attacks, or unusual access patterns. You can configure the alert settings and notifications using the Azure portal or PowerShell.
These are some of the steps to protect all the databases on sql37006S95 from SQL injection attacks.
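As a minimal illustration of the parameterized-query point above, the following T-SQL sketch passes the user-supplied value to sp_executesql as a typed parameter instead of concatenating it into the statement. The table, column, and parameter names are hypothetical:
-- The value from the application is bound as a typed parameter, so it cannot alter the statement's structure.
DECLARE @CustomerName nvarchar(100) = N'O''Brien';  -- value received from the application
EXEC sp_executesql
    N'SELECT CustomerId, CustomerName FROM dbo.Customers WHERE CustomerName = @Name;',
    N'@Name nvarchar(100)',
    @Name = @CustomerName;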
You have an instance of SQL Server on Azure Virtual Machines named VM1.
You need to use an Azure Automation runbook to initiate a SQL Server database backup on VM1.
How should you complete the command? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Correct answer:

Explanation:
You have an Azure subscription.
You plan to deploy an Azure SQL database by using an Azure Resource Manager template.
How should you complete the template? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Correct answer:

Explanation:

Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/database/single-database-create-arm-template-quickstart
You have an Azure subscription that contains the following resources:
* 10 Azure SQL databases
* Five Azure SQL managed instances
* Five instances of SQL Server on Azure Virtual Machines
You need to implement a centralized monitoring solution for all the Azure SQL resources. The solution must minimize administrative effort. What should you include in the solution?

You have an on-premises app named App1 that stores data in an on-premises Microsoft SQL Server 2016 database named DB1.
You plan to deploy additional instances of App1 to separate Azure regions. Each region will have a separate instance of App1 and DB1. The separate instances of DB1 will sync by using Azure SQL Data Sync.
You need to recommend a database service for the deployment. The solution must minimize administrative effort.
What should you include in the recommendation?

Explanation: (Visible only to GoShiken members)
You have a new Azure SQL database. The database contains a column that stores confidential information.
You need to track each time values from the column are returned in a query. The tracking information must be stored for 365 days from the date the query was executed.
Which three actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

Correct answer: A, B, D
Explanation: (Visible only to GoShiken members)
You have SQL Server 2019 on an Azure virtual machine that runs Windows Server 2019. The virtual machine has 4 vCPUs and 28 GB of memory.
You scale up the virtual machine to 8 vCPUs and 64 GB of memory.
You need to reduce tempdb contention without negatively affecting server performance.
What is the number of secondary data files that you should configure for tempdb?
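The required count is part of the hidden answer, but the widely cited guidance is one tempdb data file per logical processor, up to eight files. On a SQL Server VM each secondary file is added with ALTER DATABASE; a minimal sketch, assuming a hypothetical F:\tempdb path and illustrative size values, is:
-- Path, size, and growth values are assumptions; repeat for each additional file and keep all files identically sized.
ALTER DATABASE tempdb
ADD FILE (NAME = N'tempdev2',
          FILENAME = N'F:\tempdb\tempdev2.ndf',
          SIZE = 1024MB,
          FILEGROWTH = 256MB);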

You need to implement the surrogate key for the retail store table. The solution must meet the sales transaction dataset requirements.
What should you create?

Explanation: (Visible only to GoShiken members)
You have an Azure SQL database. The database contains a table that uses a columnstore index and is accessed infrequently.
You enable columnstore archival compression.
What are two possible results of the configuration? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

Explanation: (Visible only to GoShiken members)
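For context, columnstore archival compression is applied by rebuilding the index (or specific partitions) with the COLUMNSTORE_ARCHIVE option. A minimal sketch, assuming a hypothetical table dbo.Sales with a clustered columnstore index named CCI_Sales, is:
-- Archival compression typically shrinks storage further but costs extra CPU when the data is queried.
ALTER INDEX CCI_Sales ON dbo.Sales
REBUILD PARTITION = ALL
WITH (DATA_COMPRESSION = COLUMNSTORE_ARCHIVE);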
You plan to deploy two instances of SQL Server on Azure virtual machines in a highly available configuration that will use an Always On availability group.
You need to recommend a deployment solution that meets the following requirements:
* Provides a Service Level Agreement (SLA) of at least 99.95%
* Replicates databases in the same group synchronously
* Minimizes the latency of database writes
What should you recommend?

Explanation: (Visible only to GoShiken members)
You are planning a solution that will use Azure SQL Database. Usage of the solution will peak from October 1 to January 1 each year.
During peak usage, the database will require the following:
* 24 cores
* 500 GB of storage
* 124 GB of memory
* More than 50,000 IOPS
During periods of off-peak usage, the service tier of Azure SQL Database will be set to Standard.
Which service tier should you use during peak usage?

Explanation: (Visible only to GoShiken members)
You have an Azure subscription that contains the resources shown in the following table.

You need to create a read-only replica of DB1 and configure the App1 instances to use the replica.
What should you do? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Correct answer:

Explanation:

Reference:
https://sqlserverguides.com/read-only-replica-azure-sql/
You have an Azure SQL Database instance named DatabaseA on a server named Server1.
You plan to add a new user named App1 to DatabaseA and grant App1 db_datareader permissions. App1 will use SQL Server Authentication.
You need to create App1. The solution must ensure that App1 can be given access to other databases by using the same credentials.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Correct answer:

Explanation:

Step 1: On the master database, run CREATE LOGIN [App1] WITH PASSWORD = 'p@aaW0rd!'
Logins are server-wide login and password pairs, where the login has the same password across all databases.
Here is some sample Transact-SQL that creates a login:
CREATE LOGIN readonlylogin WITH password='1231!#ASDF!a';
You must be connected to the master database on SQL Azure with the administrative login (which you get from the SQL Azure portal) to execute the CREATE LOGIN command.
Step 2: On DatabaseA, run CREATE USER [App1] FROM LOGIN [App1]
Users are created per database and are associated with logins. You must be connected to the database in which you want to create the user. In most cases, this is not the master database. Here is some sample Transact-SQL that creates a user:
CREATE USER readonlyuser FROM LOGIN readonlylogin;
Step 3: On DatabaseA, run ALTER ROLE db_datareader ADD MEMBER [App1]
Just creating the user does not give them permissions to the database. You have to grant them access. In the Transact-SQL example below, the readonlyuser is given read-only permissions to the database via the db_datareader role.
EXEC sp_addrolemember 'db_datareader', 'readonlyuser';
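Step 3 is expressed with ALTER ROLE, while the sample line above uses the older sp_addrolemember procedure; the equivalent modern statement for the same sample user would be:
ALTER ROLE db_datareader ADD MEMBER [readonlyuser];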
Reference:
https://azure.microsoft.com/en-us/blog/adding-users-to-your-sql-azure-database/