Google Professional-Cloud-Database-Engineer Dumps - Google Cloud Certified - Professional Cloud Database Engineer PDF Sample Questions

Exam Code:
Professional-Cloud-Database-Engineer
Exam Name:
Google Cloud Certified - Professional Cloud Database Engineer
132 Questions
Last Updated: 13 September, 2024
PDF + Test Engine: $60 (regular price $78)
Test Engine Only Demo: $50 (regular price $65)
PDF Only Demo: $35 (regular price $45.50)


Best Google Professional-Cloud-Database-Engineer Dumps - Pass Your Exam on the First Attempt

Our Professional-Cloud-Database-Engineer dumps are better than other cheap Professional-Cloud-Database-Engineer study materials.

The best way to pass your Google Professional-Cloud-Database-Engineer exam is to prepare with reliable study material. Realexamdumps is one of the most trusted websites for Google Cloud Database Engineer exam questions and answers, so you can take the Professional-Cloud-Database-Engineer Google Cloud Certified - Professional Cloud Database Engineer exam with full confidence. You can get a free Google Cloud Certified - Professional Cloud Database Engineer demo from realexamdumps. We back your success in the Professional-Cloud-Database-Engineer exam with the help of these Google dumps, and you will be proud to become part of the realexamdumps family.

Our success rate over the past five years has been very impressive, and our customers have been able to build their careers in the IT field.



Sample Questions

Realexamdumps provides the most up-to-date Cloud Database Engineer questions and answers. Here are a few sample questions:

Google Professional-Cloud-Database-Engineer Sample Question 1

You are a DBA on a Cloud Spanner instance with multiple databases. You need to assign these privileges to all members of the application development team on a specific database:

Can read tables, views, and DDL

Can write rows to the tables

Can add columns and indexes

Cannot drop the database

What should you do?


Options:

A. Assign the Cloud Spanner Database Reader and Cloud Spanner Backup Writer roles.
B. Assign the Cloud Spanner Database Admin role.
C. Assign the Cloud Spanner Database User role.
D. Assign the Cloud Spanner Admin role.

Answer: C
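The roles/spanner.databaseUser role matches the listed requirements: it allows reading, writing rows, and schema changes (DDL such as adding columns and indexes) on a specific database, but it does not include the permission to drop the database. A minimal sketch of the database-level IAM binding you would add; the group address is a placeholder assumption:

```python
# Sketch: IAM policy binding granting the dev team the Cloud Spanner Database
# User role on one database. The role covers read, write, and updateDdl, but
# not spanner.databases.drop. The group email is an assumed placeholder.
def database_user_binding(group_email: str) -> dict:
    return {
        "role": "roles/spanner.databaseUser",
        "members": [f"group:{group_email}"],
    }

binding = database_user_binding("app-dev-team@example.com")
```

You would attach this binding to the database resource itself (not the instance or project) so the access is scoped to the one database.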

Google Professional-Cloud-Database-Engineer Sample Question 2

Your application follows a microservices architecture and uses a single large Cloud SQL instance, which is starting to have performance issues as your application grows. In the Cloud Monitoring dashboard, the CPU utilization looks normal. You want to follow Google-recommended practices to resolve and prevent these performance issues while avoiding any major refactoring. What should you do?


Options:

A. Use Cloud Spanner instead of Cloud SQL.
B. Increase the number of CPUs for your instance.
C. Increase the storage size for the instance.
D. Use many smaller Cloud SQL instances.

Answer: D

Google Professional-Cloud-Database-Engineer Sample Question 3

Your team is building a new inventory management application that will require read and write database instances in multiple Google Cloud regions around the globe. Your database solution requires 99.99% availability and global transactional consistency. You need a fully managed backend relational database to store inventory changes. What should you do?


Options:

A. Use Bigtable.
B. Use Firestore.
C. Use Cloud SQL for MySQL.
D. Use Cloud Spanner.

Answer: D

Google Professional-Cloud-Database-Engineer Sample Question 4

Your organization is running a MySQL workload in Cloud SQL. Suddenly you see a degradation in database performance. You need to identify the root cause of the performance degradation. What should you do?


Options:

A. Use Logs Explorer to analyze log data.
B. Use Cloud Monitoring to monitor CPU, memory, and storage utilization metrics.
C. Use Error Reporting to count, analyze, and aggregate the data.
D. Use Cloud Debugger to inspect the state of an application.

Answer: B

Google Professional-Cloud-Database-Engineer Sample Question 5

You plan to use Database Migration Service to migrate data from a PostgreSQL on-premises instance to Cloud SQL. You need to identify the prerequisites for creating and automating the task. What should you do? (Choose two.)


Options:

A. Drop or disable all users except database administration users.
B. Disable all foreign key constraints on the source PostgreSQL database.
C. Ensure that all PostgreSQL tables have a primary key.
D. Shut down the database before the Data Migration Service task is started.
E. Ensure that pglogical is installed on the source PostgreSQL database.

Answer: C, E
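Database Migration Service for PostgreSQL replicates changes through pglogical, which must be installed on the source and which needs a primary key on each table to replicate UPDATEs and DELETEs. A query you could run against the source instance to flag tables without a primary key before starting the migration, sketched here as a Python constant:

```python
# Sketch: list user tables on the source PostgreSQL instance that lack a
# primary key, since pglogical (used by Database Migration Service) cannot
# replicate row changes for them.
TABLES_WITHOUT_PK = """
SELECT t.table_schema, t.table_name
FROM information_schema.tables t
LEFT JOIN information_schema.table_constraints c
  ON c.table_schema = t.table_schema
 AND c.table_name = t.table_name
 AND c.constraint_type = 'PRIMARY KEY'
WHERE t.table_type = 'BASE TABLE'
  AND t.table_schema NOT IN ('pg_catalog', 'information_schema')
  AND c.constraint_name IS NULL
ORDER BY 1, 2;
"""
```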

Google Professional-Cloud-Database-Engineer Sample Question 6

Your team recently released a new version of a highly consumed application to accommodate additional user traffic. Shortly after the release, you received an alert from your production monitoring team that there is consistently high replication lag between your primary instance and the read replicas of your Cloud SQL for MySQL instances. You need to resolve the replication lag. What should you do?


Options:

A. Identify and optimize slow running queries, or set parallel replication flags.
B. Stop all running queries, and re-create the replicas.
C. Edit the primary instance to upgrade to a larger disk, and increase vCPU count.
D. Edit the primary instance to add additional memory.

Answer: A
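The parallel replication flags mentioned in option A can be sketched as a set of Cloud SQL database flags on the MySQL replica; the flag names are standard MySQL multi-threaded replication settings, and the values shown are illustrative assumptions, not tuned recommendations:

```python
# Sketch: database flags to enable multi-threaded (parallel) replication on a
# Cloud SQL for MySQL read replica, one documented lever for reducing
# replication lag. Values are illustrative assumptions.
PARALLEL_REPLICATION_FLAGS = {
    "slave_parallel_type": "LOGICAL_CLOCK",   # schedule by commit timestamps
    "slave_parallel_workers": "4",            # number of applier threads
    "slave_preserve_commit_order": "ON",      # keep commits in source order
}
```

Alongside these flags, identifying and optimizing slow-running queries on the primary is usually the first step, since long transactions stall the replica's applier threads no matter how many there are.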

Google Professional-Cloud-Database-Engineer Sample Question 7

Your organization has a ticketing system that needs an online marketing analytics and reporting application. You need to select a relational database that can manage hundreds of terabytes of data to support this new application. Which database should you use?


Options:

A. Cloud SQL
B. BigQuery
C. Cloud Spanner
D. Bigtable

Answer: B

Google Professional-Cloud-Database-Engineer Sample Question 8

You want to migrate your PostgreSQL database from another cloud provider to Cloud SQL. You plan on using Database Migration Service and need to assess the impact of any known limitations. What should you do? (Choose two.)


Options:

A. Identify whether the database has over 512 tables.
B. Identify all tables that do not have a primary key.
C. Identify all tables that do not have at least one foreign key.
D. Identify whether the source database is encrypted using pgcrypto extension.
E. Identify whether the source database uses customer-managed encryption keys (CMEK).

Answer: B, E

Google Professional-Cloud-Database-Engineer Sample Question 9

You work in the logistics department. Your data analysis team needs daily extracts from Cloud SQL for MySQL to train a machine learning model. The model will be used to optimize next-day routes. You need to export the data in CSV format. You want to follow Google-recommended practices. What should you do?


Options:

A. Use Cloud Scheduler to trigger a Cloud Function that will run a select * from table(s) query to call the cloudsql.instances.export API.
B. Use Cloud Scheduler to trigger a Cloud Function through Pub/Sub to call the cloudsql.instances.export API.
C. Use Cloud Composer to orchestrate an export by calling the cloudsql.instances.export API.
D. Use Cloud Composer to execute a select * from table(s) query and export results.

Answer: B
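The Pub/Sub-triggered Cloud Function in option B would call the sqladmin instances.export API with a CSV export context. A sketch of the request body such a function could send; the bucket, database, and query names are placeholder assumptions:

```python
# Sketch: request body for the Cloud SQL Admin API's instances.export method,
# producing a CSV extract of a select query into a Cloud Storage bucket.
# Bucket, database, and query are assumed placeholders.
def csv_export_body(bucket: str, database: str, query: str) -> dict:
    return {
        "exportContext": {
            "fileType": "CSV",
            "uri": f"gs://{bucket}/extracts/daily.csv",
            "databases": [database],
            "csvExportOptions": {"selectQuery": query},
        }
    }

body = csv_export_body("ml-extracts", "sales", "SELECT * FROM orders")
```

Cloud Scheduler publishes to a Pub/Sub topic on a daily cron, the topic triggers the function, and the function submits this body for the target instance, so no long-lived orchestration infrastructure is needed.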

Google Professional-Cloud-Database-Engineer Sample Question 10

You work for a large retail and ecommerce company that is starting to extend their business globally. Your company plans to migrate to Google Cloud. You want to use platforms that will scale easily, handle transactions with the least amount of latency, and provide a reliable customer experience. You need a storage layer for sales transactions and current inventory levels. You want to retain the same relational schema that your existing platform uses. What should you do?


Options:

A. Store your data in Firestore in a multi-region location, and place your compute resources in one of the constituent regions.
B. Deploy Cloud Spanner using a multi-region instance, and place your compute resources close to the default leader region.
C. Build an in-memory cache in Memorystore, and deploy to the specific geographic regions where your application resides.
D. Deploy a Bigtable instance with a cluster in one region and a replica cluster in another geographic region.

Answer: B

Google Professional-Cloud-Database-Engineer Sample Question 11

Your company uses Cloud Spanner for a mission-critical inventory management system that is globally available. You recently loaded stock keeping unit (SKU) and product catalog data from a company acquisition and observed hot-spots in the Cloud Spanner database. You want to follow Google-recommended schema design practices to avoid performance degradation. What should you do? (Choose two.)


Options:

A. Use an auto-incrementing value as the primary key.
B. Normalize the data model.
C. Promote low-cardinality attributes in multi-attribute primary keys.
D. Promote high-cardinality attributes in multi-attribute primary keys.
E. Use bit-reverse sequential value as the primary key.

Answer: D, E
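A plain auto-incrementing key writes every new row to the same end of the Cloud Spanner key space, which is exactly what causes hot-spots; bit-reversing a sequential value scatters consecutive inserts across the key range. A minimal sketch of that transformation:

```python
# Sketch: bit-reverse a sequential 64-bit ID so consecutive inserts land far
# apart in the Cloud Spanner key space, avoiding write hot-spots.
def bit_reverse_64(n: int) -> int:
    result = 0
    for _ in range(64):
        result = (result << 1) | (n & 1)
        n >>= 1
    return result

# Consecutive IDs map to widely separated keys after reversal.
keys = [bit_reverse_64(i) for i in (1, 2, 3)]
```

The function is its own inverse, so the original sequential value can always be recovered by reversing again.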

Google Professional-Cloud-Database-Engineer Sample Question 12

An analytics team needs to read data out of Cloud SQL for SQL Server and update a table in Cloud Spanner. You need to create a service account and grant least privilege access using predefined roles. What roles should you assign to the service account?


Options:

A. roles/cloudsql.viewer and roles/spanner.databaseUser
B. roles/cloudsql.editor and roles/spanner.admin
C. roles/cloudsql.client and roles/spanner.databaseReader
D. roles/cloudsql.instanceUser and roles/spanner.databaseUser

Answer: D

Google Professional-Cloud-Database-Engineer Sample Question 13

Your company is shutting down their data center and migrating several MySQL and PostgreSQL databases to Google Cloud. Your database operations team is severely constrained by ongoing production releases and the lack of capacity for additional on-premises backups. You want to ensure that the scheduled migrations happen with minimal downtime and that the Google Cloud databases stay in sync with the on-premises data changes until the applications can cut over. What should you do? (Choose two.)


Options:

A. Use Database Migration Service to migrate the databases to Cloud SQL.
B. Use a cross-region read replica to migrate the databases to Cloud SQL.
C. Use replication from an external server to migrate the databases to Cloud SQL.
D. Use an external read replica to migrate the databases to Cloud SQL.
E. Use a read replica to migrate the databases to Cloud SQL.

Answer: A, C

Google Professional-Cloud-Database-Engineer Sample Question 14

You are managing a small Cloud SQL instance for developers to do testing. The instance is not critical and has a recovery point objective (RPO) of several days. You want to minimize ongoing costs for this instance. What should you do?


Options:

A. Take no backups, and turn off transaction log retention.
B. Take one manual backup per day, and turn off transaction log retention.
C. Turn on automated backup, and turn off transaction log retention.
D. Turn on automated backup, and turn on transaction log retention.

Answer: C
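With an RPO of several days, automated daily backups are enough, and disabling point-in-time recovery stops transaction log retention and its storage cost. A sketch of the gcloud invocation, built as an argument list; the instance name and backup window are assumed placeholders:

```python
# Sketch: gcloud command (as an argument list) that keeps automated daily
# backups but disables point-in-time recovery, so transaction logs are no
# longer retained. Instance name and start time are assumed placeholders.
def low_cost_backup_args(instance: str) -> list:
    return [
        "gcloud", "sql", "instances", "patch", instance,
        "--backup-start-time=03:00",            # enable automated backups
        "--no-enable-point-in-time-recovery",   # stop retaining transaction logs
    ]

args = low_cost_backup_args("dev-test-instance")
```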

Google Professional-Cloud-Database-Engineer Sample Question 15

You need to migrate existing databases from Microsoft SQL Server 2016 Standard Edition on a single Windows Server 2019 Datacenter Edition to a single Cloud SQL for SQL Server instance. During the discovery phase of your project, you notice that your on-premises server peaks at around 25,000 read IOPS. You need to ensure that your Cloud SQL instance is sized appropriately to maximize read performance. What should you do?


Options:

A. Create a SQL Server 2019 Standard on Standard machine type with 4 vCPUs, 15 GB of RAM, and 800 GB of solid-state drive (SSD).
B. Create a SQL Server 2019 Standard on High Memory machine type with at least 16 vCPUs, 104 GB of RAM, and 200 GB of SSD.
C. Create a SQL Server 2019 Standard on High Memory machine type with 16 vCPUs, 104 GB of RAM, and 4 TB of SSD.
D. Create a SQL Server 2019 Enterprise on High Memory machine type with 16 vCPUs, 104 GB of RAM, and 500 GB of SSD.

Answer: C
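The sizing logic behind the answer: Cloud SQL SSD storage performance scales with disk size, on the order of 30 read IOPS per provisioned GB (an approximation of the documented per-GB rate; actual limits also depend on vCPU count). A quick back-of-the-envelope check:

```python
# Sketch: rough minimum SSD size for a read-IOPS target, assuming roughly
# 30 read IOPS per GB of provisioned SSD (approximation; real limits also
# depend on machine type and vCPU count).
IOPS_PER_GB = 30

def min_ssd_gb(required_read_iops: int) -> int:
    # Ceiling division: the smallest disk size meeting the IOPS target.
    return -(-required_read_iops // IOPS_PER_GB)

needed = min_ssd_gb(25_000)  # ~834 GB floor; a 4 TB disk clears it easily
```

Under this assumption, 200 GB or 500-800 GB disks fall short of 25,000 read IOPS, while 4 TB of SSD provides ample headroom.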


and so much more...