Google Professional-Cloud-Architect Dumps - Google Certified Professional - Cloud Architect (GCP) PDF Sample Questions

Exam Code: Professional-Cloud-Architect
Exam Name: Google Certified Professional - Cloud Architect (GCP)
275 Questions
Last Update Date: 20 May, 2024
PDF + Test Engine: $60 (regular price $78)
Test Engine Only: $50 (regular price $65)
PDF Only: $35 (regular price $45.50)


Best Google Professional-Cloud-Architect Dumps - Pass Your Exam on the First Attempt

Our Professional-Cloud-Architect dumps are better than other cheap Professional-Cloud-Architect study materials.

The best way to pass your Google Professional-Cloud-Architect exam is to use reliable study material. Realexamdumps is one of the most authentic websites for Google Cloud Certified exam questions and answers. Pass your Professional-Cloud-Architect Google Certified Professional - Cloud Architect (GCP) exam with full confidence. You can get a free Google Certified Professional - Cloud Architect (GCP) demo from realexamdumps. We ensure 100% success in the Professional-Cloud-Architect exam with the help of our Google dumps. You will feel proud to become part of the realexamdumps family.

Our success rate over the past 5 years has been very impressive. Our customers have been able to build their careers in the IT field.

Search from 45,000+ exams, buy your desired exam, download it, and pass your exam.

Sample Questions

Realexamdumps provides the most updated Google Cloud Certified questions and answers. Here are a few sample questions:

Google Professional-Cloud-Architect Sample Question 1

For this question, refer to the Mountkirk Games case study.

Mountkirk Games has deployed their new backend on Google Cloud Platform (GCP). You want to create a thorough testing process for new versions of the backend before they are released to the public. You want the testing environment to scale in an economical way. How should you design the process?


Options:

A. Create a scalable environment in GCP for simulating production load.
B. Use the existing infrastructure to test the GCP-based backend at scale.
C. Build stress tests into each component of your application using resources internal to GCP to simulate load.
D. Create a set of static environments in GCP to test different levels of load — for example, high, medium, and low.

Answer: A
Explanation: From the scenario, the requirements for the game backend platform are: dynamically scale up or down based on game activity; connect to a managed NoSQL database service; run a customized Linux distro.
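
For readers who want to see what an economical, scalable test environment can look like in practice, here is a minimal sketch that uses an autoscaled managed instance group as the load generator. The template name, machine type, image, and thresholds are illustrative assumptions, not part of the exam question.

    # Instance template for load-generation workers (names and sizes are assumptions)
    gcloud compute instance-templates create loadgen-template \
        --machine-type=e2-standard-4 \
        --image-family=debian-12 --image-project=debian-cloud
    # Managed instance group that starts small
    gcloud compute instance-groups managed create loadgen-mig \
        --template=loadgen-template --size=1 --zone=us-central1-a
    # Scale out only while load tests run, then shrink back to keep cost low
    gcloud compute instance-groups managed set-autoscaling loadgen-mig \
        --zone=us-central1-a --min-num-replicas=1 --max-num-replicas=50 \
        --target-cpu-utilization=0.6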

Google Professional-Cloud-Architect Sample Question 2

For this question, refer to the Mountkirk Games case study.

Mountkirk Games wants to set up a continuous delivery pipeline. Their architecture includes many small services that they want to be able to update and roll back quickly. Mountkirk Games has the following requirements:

• Services are deployed redundantly across multiple regions in the US and Europe.

• Only frontend services are exposed on the public internet.

• They can provide a single frontend IP for their fleet of services.

• Deployment artifacts are immutable.

Which set of products should they use?


Options:

A. Google Cloud Storage, Google Cloud Dataflow, Google Compute Engine
B. Google Cloud Storage, Google App Engine, Google Network Load Balancer
C. Google Container Registry, Google Container Engine, Google HTTP(S) Load Balancer
D. Google Cloud Functions, Google Cloud Pub/Sub, Google Cloud Deployment Manager

Answer: D

Google Professional-Cloud-Architect Sample Question 3

For this question, refer to the Dress4Win case study.

The Dress4Win security team has disabled external SSH access into production virtual machines (VMs) on Google Cloud Platform (GCP). The operations team needs to remotely manage the VMs, build and push Docker containers, and manage Google Cloud Storage objects. What can they do?


Options:

A. Grant the operations engineers access to use Google Cloud Shell.
B. Configure a VPN connection to GCP to allow SSH access to the cloud VMs.
C. Develop a new access request process that grants temporary SSH access to cloud VMs when an operations engineer needs to perform a task.
D. Have the development team build an API service that allows the operations team to execute specific remote procedure calls to accomplish their tasks.

Answer: B

Google Professional-Cloud-Architect Sample Question 4

Dress4Win has end-to-end tests covering 100% of their endpoints.

They want to ensure that the move to the cloud does not introduce any new bugs.

Which additional testing methods should the developers employ to prevent an outage?


Options:

A. They should run the end-to-end tests in the cloud staging environment to determine if the code is working as intended.
B. They should enable Google Stackdriver Debugger on the application code to show errors in the code.
C. They should add additional unit tests and production-scale load tests on their cloud staging environment.
D. They should add canary tests so developers can measure how much of an impact the new release causes to latency.

Answer: C

Google Professional-Cloud-Architect Sample Question 5

For this question, refer to the TerramEarth case study.

TerramEarth plans to connect all 20 million vehicles in the field to the cloud. This increases the volume to 20 million 600-byte records a second, or about 40 TB an hour. How should you design the data ingestion?


Options:

A. Vehicles write data directly to GCS.
B. Vehicles write data directly to Google Cloud Pub/Sub.
C. Vehicles stream data directly to Google BigQuery.
D. Vehicles continue to write data using the existing system (FTP).

Answer: B
Explanation: https://cloud.google.com/solutions/data-lifecycle-cloud-platform https://cloud.google.com/solutions/designing-connected-vehicle-platform
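
As a rough sketch of the Pub/Sub-based ingestion path the answer describes; the topic, subscription, and message contents are made up for illustration:

    # Create a topic for vehicle telemetry and a subscription for the downstream pipeline
    gcloud pubsub topics create vehicle-telemetry
    gcloud pubsub subscriptions create telemetry-ingest --topic=vehicle-telemetry
    # A vehicle (or a gateway acting on its behalf) publishes a record
    gcloud pubsub topics publish vehicle-telemetry \
        --message='{"vehicle_id":"v-123","ts":"2024-05-20T12:00:00Z"}'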

Google Professional-Cloud-Architect Sample Question 6

For this question, refer to the Dress4Win case study.

Dress4Win would like to become familiar with deploying applications to the cloud by successfully deploying some applications quickly, as is. They have asked for your recommendation. What should you advise?


Options:

A. Identify self-contained applications with external dependencies as a first move to the cloud.
B. Identify enterprise applications with internal dependencies and recommend these as a first move to the cloud.
C. Suggest moving their in-house databases to the cloud and continue serving requests to on-premise applications.
D. Recommend moving their message queuing servers to the cloud and continue handling requests to on-premise applications.

Answer: A
Explanation: https://cloud.google.com/blog/products/gcp/the-five-phases-of-migrating-to-google-cloud-platform

Google Professional-Cloud-Architect Sample Question 7

For this question, refer to the Helicopter Racing League (HRL) case study. HRL wants better prediction accuracy from their ML prediction models. They want you to use Google’s AI Platform so HRL can understand and interpret the predictions. What should you do?


Options:

A. Use Explainable AI.
B. Use Vision AI.
C. Use Google Cloud’s operations suite.
D. Use Jupyter Notebooks.

Answer: A
Reference: https://cloud.google.com/ai-platform/prediction/docs/ai-explanations/preparing-metadata

Google Professional-Cloud-Architect Sample Question 8

For this question, refer to the Helicopter Racing League (HRL) case study. A recent finance audit of cloud infrastructure noted an exceptionally high number of Compute Engine instances are allocated to do video encoding and transcoding. You suspect that these virtual machines are zombie machines that were not deleted after their workloads completed. You need to quickly get a list of which VM instances are idle. What should you do?


Options:

A. Log into each Compute Engine instance and collect disk, CPU, memory, and network usage statistics for analysis.
B. Use the gcloud compute instances list to list the virtual machine instances that have the idle: true label set.
C. Use the gcloud recommender command to list the idle virtual machine instances.
D. From the Google Console, identify which Compute Engine instances in the managed instance groups are no longer responding to health check probes.

Answer: C
Reference: https://cloud.google.com/compute/docs/instances/viewing-and-applying-idle-vm-recommendations
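
A minimal sketch of the gcloud recommender call the reference describes; the project ID and zone are placeholders, and the command is run once per zone of interest:

    # List idle-VM recommendations for one zone
    gcloud recommender recommendations list \
        --project=PROJECT_ID \
        --location=us-central1-a \
        --recommender=google.compute.instance.IdleResourceRecommender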

Google Professional-Cloud-Architect Sample Question 9

For this question, refer to the Helicopter Racing League (HRL) case study. The HRL development team releases a new version of their predictive capability application every Tuesday evening at 3 a.m. UTC to a repository. The security team at HRL has developed an in-house penetration test Cloud Function called Airwolf. The security team wants to run Airwolf against the predictive capability application as soon as it is released every Tuesday. You need to set up Airwolf to run at the recurring weekly cadence. What should you do?


Options:

A. Set up Cloud Tasks and a Cloud Storage bucket that triggers a Cloud Function.
B. Set up a Cloud Logging sink and a Cloud Storage bucket that triggers a Cloud Function.
C. Configure the deployment job to notify a Pub/Sub queue that triggers a Cloud Function.
D. Set up Identity and Access Management (IAM) and Confidential Computing to trigger a Cloud Function.

Answer: B

Google Professional-Cloud-Architect Sample Question 10

For this question, refer to the Dress4Win case study. To be legally compliant during an audit, Dress4Win must be able to give insights into all administrative actions that modify the configuration or metadata of resources on Google Cloud.

What should you do?


Options:

A. Use Stackdriver Trace to create a trace list analysis.
B. Use Stackdriver Monitoring to create a dashboard on the project’s activity.
C. Enable Cloud Identity-Aware Proxy in all projects, and add the group of Administrators as a member.
D. Use the Activity page in the GCP Console and Stackdriver Logging to provide the required insight.

Answer: A
Explanation: https://cloud.google.com/logging/docs/audit/
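
The explanation link covers Cloud Audit Logs; as a small, hedged sketch, Admin Activity audit log entries (which record configuration and metadata changes) can be read with gcloud, where PROJECT_ID is a placeholder:

    # Read recent Admin Activity audit log entries for the project
    gcloud logging read \
        'logName="projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity"' \
        --project=PROJECT_ID --limit=20 --format=json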

Google Professional-Cloud-Architect Sample Question 11

For this question, refer to the Dress4Win case study. You want to ensure that your on-premises architecture meets business requirements before you migrate your solution.

What change in the on-premises architecture should you make?


Options:

A. Replace RabbitMQ with Google Pub/Sub.
B. Downgrade MySQL to v5.7, which is supported by Cloud SQL for MySQL.
C. Resize compute resources to match predefined Compute Engine machine types.
D. Containerize the microservices and host them in Google Kubernetes Engine.

Answer: D
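
A hedged sketch of what containerizing one of the microservices and hosting it on GKE could look like; the image name, cluster name, zone, and sizes are assumptions for illustration only:

    # Build and push a container image for one service
    docker build -t gcr.io/PROJECT_ID/catalog-service:v1 .
    docker push gcr.io/PROJECT_ID/catalog-service:v1
    # Create a small GKE cluster and deploy the service
    gcloud container clusters create dress4win-dev --zone=us-central1-a --num-nodes=3
    gcloud container clusters get-credentials dress4win-dev --zone=us-central1-a
    kubectl create deployment catalog-service --image=gcr.io/PROJECT_ID/catalog-service:v1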

Google Professional-Cloud-Architect Sample Question 12

For this question, refer to the Dress4Win case study. Considering the given business requirements, how would you automate the deployment of web and transactional data layers?


Options:

A. Deploy Nginx and Tomcat using Cloud Deployment Manager to Compute Engine. Deploy a Cloud SQL server to replace MySQL. Deploy Jenkins using Cloud Deployment Manager.
B. Deploy Nginx and Tomcat using Cloud Launcher. Deploy a MySQL server using Cloud Launcher. Deploy Jenkins to Compute Engine using Cloud Deployment Manager scripts.
C. Migrate Nginx and Tomcat to App Engine. Deploy a Cloud Datastore server to replace the MySQL server in a high-availability configuration. Deploy Jenkins to Compute Engine using Cloud Launcher.
D. Migrate Nginx and Tomcat to App Engine. Deploy a MySQL server using Cloud Launcher. Deploy Jenkins to Compute Engine using Cloud Launcher.

Answer: B
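
For the Deployment Manager piece of the answer, here is a minimal, hedged sketch of a config that declares a single Jenkins VM and the command that deploys it; the file name, resource names, zone, machine type, and image are all illustrative assumptions:

    # Write a minimal Deployment Manager config describing one Compute Engine VM
    cat > jenkins.yaml <<'EOF'
    resources:
    - name: jenkins-vm
      type: compute.v1.instance
      properties:
        zone: us-central1-a
        machineType: zones/us-central1-a/machineTypes/e2-standard-2
        disks:
        - boot: true
          autoDelete: true
          initializeParams:
            sourceImage: projects/debian-cloud/global/images/family/debian-12
        networkInterfaces:
        - network: global/networks/default
    EOF
    # Create the deployment from the config
    gcloud deployment-manager deployments create jenkins --config jenkins.yaml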

Google Professional-Cloud-Architect Sample Question 13

For this question, refer to the Dress4Win case study. You are responsible for the security of data stored in Cloud Storage for your company, Dress4Win. You have already created a set of Google Groups and assigned the appropriate users to those groups. You should use Google best practices and implement the simplest design to meet the requirements.

Considering Dress4Win’s business and technical requirements, what should you do?


Options:

A. Assign custom IAM roles to the Google Groups you created in order to enforce security requirements. Encrypt data with a customer-supplied encryption key when storing files in Cloud Storage.
B. Assign custom IAM roles to the Google Groups you created in order to enforce security requirements. Enable default storage encryption before storing files in Cloud Storage.
C. Assign predefined IAM roles to the Google Groups you created in order to enforce security requirements. Utilize Google’s default encryption at rest when storing files in Cloud Storage.
D. Assign predefined IAM roles to the Google Groups you created in order to enforce security requirements. Ensure that the default Cloud KMS key is set before storing files in Cloud Storage.

Answer: D
Explanation: https://cloud.google.com/iam/docs/understanding-service-accounts
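
The predefined-role options both start from binding predefined Cloud Storage roles to the Google Groups; a minimal sketch, with the group and bucket names invented for illustration:

    # Grant the operations group read access to objects in the bucket using a predefined role
    gsutil iam ch group:cloud-ops@dress4win.com:roles/storage.objectViewer gs://dress4win-assets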

Google Professional-Cloud-Architect Sample Question 14

You need to develop procedures to verify resilience of disaster recovery for remote recovery using GCP. Your production environment is hosted on-premises. You need to establish a secure, redundant connection between your on-premises network and the GCP network.

What should you do?


Options:

A. Verify that Dedicated Interconnect can replicate files to GCP. Verify that direct peering can establish a secure connection between your networks if Dedicated Interconnect fails.
B. Verify that Dedicated Interconnect can replicate files to GCP. Verify that Cloud VPN can establish a secure connection between your networks if Dedicated Interconnect fails.
C. Verify that the Transfer Appliance can replicate files to GCP. Verify that direct peering can establish a secure connection between your networks if the Transfer Appliance fails.
D. Verify that the Transfer Appliance can replicate files to GCP. Verify that Cloud VPN can establish a secure connection between your networks if the Transfer Appliance fails.

Answer: B
Explanation: https://cloud.google.com/interconnect/docs/how-to/direct-peering
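
For the Cloud VPN fallback in the answer, a minimal sketch of creating the gateway resources; the network, region, and peer IP are placeholders, and the tunnels, Cloud Router, and BGP sessions needed to complete the connection are omitted:

    # HA VPN gateway on the GCP side
    gcloud compute vpn-gateways create dr-vpn-gateway --network=prod-vpc --region=us-central1
    # Resource describing the on-premises VPN device
    gcloud compute external-vpn-gateways create onprem-vpn-device --interfaces 0=203.0.113.10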

Google Professional-Cloud-Architect Sample Question 15

You are running a cluster on Kubernetes Engine to serve a web application. Users are reporting that a specific part of the application is not responding anymore. You notice that all pods of your deployment keep restarting after 2 seconds. The application writes logs to standard output. You want to inspect the logs to find the cause of the issue. Which approach can you take?


Options:

A. Review the Stackdriver logs for each Compute Engine instance that is serving as a node in the cluster.
B. Review the Stackdriver logs for the specific Kubernetes Engine container that is serving the unresponsive part of the application.
C. Connect to the cluster using gcloud credentials and connect to a container in one of the pods to read the logs.
D. Review the Serial Port logs for each Compute Engine instance that is serving as a node in the cluster.

Answer: C
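
A minimal sketch of the approach in the answer: authenticate to the cluster with gcloud, then read the container logs. The cluster and pod names are placeholders; because the pods keep restarting, the --previous flag is the practical way to see output from the crashed container:

    # Fetch cluster credentials, then inspect logs from the crashing pods
    gcloud container clusters get-credentials web-cluster --zone=us-central1-a
    kubectl get pods
    kubectl logs POD_NAME --previous          # stdout/stderr from the last terminated container
    kubectl exec -it POD_NAME -- /bin/sh      # optional: open a shell in a running container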

Google Professional-Cloud-Architect Sample Question 16

Your customer runs a web service used by e-commerce sites to offer product recommendations to users. The company has begun experimenting with a machine learning model on Google Cloud Platform to improve the quality of results.

What should the customer do to improve their model’s results over time?


Options:

A. Export Cloud Machine Learning Engine performance metrics from Stackdriver to BigQuery, to be used to analyze the efficiency of the model.
B. Build a roadmap to move the machine learning model training from Cloud GPUs to Cloud TPUs, which offer better results.
C. Monitor Compute Engine announcements for availability of newer CPU architectures, and deploy the model to them as soon as they are available for additional performance.
D. Save a history of recommendations and results of the recommendations in BigQuery, to be used as training data.

Answer: D
Explanation: https://cloud.google.com/solutions/building-a-serverless-ml-model
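
A small sketch of the BigQuery side of the answer, a table for accumulating recommendations and their outcomes as future training data; the dataset, table, and schema are invented for illustration:

    # Create a dataset and a table to record each recommendation and its result
    bq mk --dataset PROJECT_ID:recsys
    bq mk --table PROJECT_ID:recsys.recommendation_events \
        user_id:STRING,item_id:STRING,recommended_at:TIMESTAMP,clicked:BOOLEAN,purchased:BOOLEAN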

Google Professional-Cloud-Architect Sample Question 17

Your organization has stored sensitive data in a Cloud Storage bucket. For regulatory reasons, your company must be able to rotate the encryption key used to encrypt the data in the bucket. The data will be processed in Dataproc. You want to follow Google-recommended practices for security. What should you do?


Options:

A. Create a key with Cloud Key Management Service (KMS). Encrypt the data using the encrypt method of Cloud KMS.
B. Create a key with Cloud Key Management Service (KMS). Set the encryption key on the bucket to the Cloud KMS key.
C. Generate a GPG key pair. Encrypt the data using the GPG key. Upload the encrypted data to the bucket.
D. Generate an AES-256 encryption key. Encrypt the data in the bucket using the customer-supplied encryption keys feature.

Answer: B
Explanation: https://cloud.google.com/storage/docs/encryption/using-customer-managed-keys#add-object-key
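
A minimal sketch of the CMEK setup in the answer; the key ring, key, bucket, and location names are placeholders:

    # Create a key ring and key in Cloud KMS
    gcloud kms keyrings create storage-keys --location=us-central1
    gcloud kms keys create bucket-key --keyring=storage-keys --location=us-central1 --purpose=encryption
    # Set the key as the bucket's default encryption key (the Cloud Storage service agent
    # needs roles/cloudkms.cryptoKeyEncrypterDecrypter on the key)
    gsutil kms encryption \
        -k projects/PROJECT_ID/locations/us-central1/keyRings/storage-keys/cryptoKeys/bucket-key \
        gs://sensitive-data-bucket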

Google Professional-Cloud-Architect Sample Question 18

For this question, refer to the EHR Healthcare case study. You need to define the technical architecture for securely deploying workloads to Google Cloud. You also need to ensure that only verified containers are deployed using Google Cloud services. What should you do? (Choose two.)


Options:

A. Enable Binary Authorization on GKE, and sign containers as part of a CI/CD pipeline.
B. Configure Jenkins to utilize Kritis to cryptographically sign a container as part of a CI/CD pipeline.
C. Configure Container Registry to only allow trusted service accounts to create and deploy containers from the registry.
D. Configure Container Registry to use vulnerability scanning to confirm that there are no vulnerabilities before deploying the workload.

Answer: A, D
Explanation: Binary Authorization ensures only verified containers are deployed. To ensure deployments are secure and consistent, automatically scan images for vulnerabilities with container analysis (https://cloud.google.com/docs/ci-cd/overview?hl=en&skip_cache=true).
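
As a hedged sketch of the two controls in the answer; the cluster name and zone are placeholders, and the exact Binary Authorization flag depends on your gcloud release (older releases use --enable-binauthz):

    # Turn on vulnerability scanning for images pushed to the registry
    gcloud services enable containerscanning.googleapis.com
    # Enforce Binary Authorization on the cluster
    gcloud container clusters update ehr-prod-cluster --zone=us-central1-a \
        --binauthz-evaluation-mode=PROJECT_SINGLETON_POLICY_ENFORCE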

Google Professional-Cloud-Architect Sample Question 19

You need to optimize batch file transfers into Cloud Storage for Mountkirk Games’ new Google Cloud solution. The batch files contain game statistics that need to be staged in Cloud Storage and be processed by an extract transform load (ETL) tool. What should you do?


Options:

A. Use gsutil to batch move files in sequence.
B. Use gsutil to batch copy the files in parallel.
C. Use gsutil to extract the files as the first part of ETL.
D. Use gsutil to load the files as the last part of ETL.

Answer: B
Reference: https://cloud.google.com/storage/docs/gsutil/commands/cp
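
A one-line sketch of the parallel copy the answer describes; the local path and bucket are placeholders:

    # -m runs the copy with parallel threads/processes, which is much faster for many files
    gsutil -m cp -r ./game-stats/*.csv gs://mountkirk-staging/batch/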

Google Professional-Cloud-Architect Sample Question 20

You need to implement a network ingress for a new game that meets the defined business and technical requirements. Mountkirk Games wants each regional game instance to be located in multiple Google Cloud regions. What should you do?


Options:

A. Configure a global load balancer connected to a managed instance group running Compute Engine instances.
B. Configure kubemci with a global load balancer and Google Kubernetes Engine.
C. Configure a global load balancer with Google Kubernetes Engine.
D. Configure Ingress for Anthos with a global load balancer and Google Kubernetes Engine.

Answer: B

Google Professional-Cloud-Architect Sample Question 21

For this question, refer to the EHR Healthcare case study. You are a developer on the EHR customer portal team. Your team recently migrated the customer portal application to Google Cloud. The load has increased on the application servers, and now the application is logging many timeout errors. You recently incorporated Pub/Sub into the application architecture, and the application is not logging any Pub/Sub publishing errors. You want to improve publishing latency. What should you do?


Options:

A. Increase the Pub/Sub Total Timeout retry value.
B. Move from a Pub/Sub subscriber pull model to a push model.
C. Turn off Pub/Sub message batching.
D. Create a backup Pub/Sub message queue.

Answer: C
Explanation: https://cloud.google.com/pubsub/docs/publisher?hl=en#batching

Google Professional-Cloud-Architect Sample Question 22

For this question, refer to the Mountkirk Games case study. You need to analyze and define the technical architecture for the database workloads for your company, Mountkirk Games. Considering the business and technical requirements, what should you do?


Options:

A. Use Cloud SQL for time series data, and use Cloud Bigtable for historical data queries.
B. Use Cloud SQL to replace MySQL, and use Cloud Spanner for historical data queries.
C. Use Cloud Bigtable to replace MySQL, and use BigQuery for historical data queries.
D. Use Cloud Bigtable for time series data, use Cloud Spanner for transactional data, and use BigQuery for historical data queries.

Answer: D
Explanation: https://cloud.google.com/bigtable/docs/schema-design-time-series

Google Professional-Cloud-Architect Sample Question 23

For this question, refer to the Mountkirk Games case study. Mountkirk Games wants to design their solution for the future in order to take advantage of cloud and technology improvements as they become available. Which two steps should they take? (Choose two.)


Options:

A. Store as much analytics and game activity data as financially feasible today so it can be used to train machine learning models to predict user behavior in the future.
B. Begin packaging their game backend artifacts in container images and running them on Kubernetes Engine to improve the ability to scale up or down based on game activity.
C. Set up a CI/CD pipeline using Jenkins and Spinnaker to automate canary deployments and improve development velocity.
D. Adopt a schema versioning tool to reduce downtime when adding new game features that require storing additional player data in the database.
E. Implement a weekly rolling maintenance process for the Linux virtual machines so they can apply critical kernel patches and package updates and reduce the risk of 0-day vulnerabilities.

Answer: B, D

Google Professional-Cloud-Architect Sample Question 24

For this question, refer to the TerramEarth case study. A new architecture that writes all incoming data to BigQuery has been introduced. You notice that the data is dirty, and want to ensure data quality on an automated daily basis while managing cost.

What should you do?


Options:

A. Set up a streaming Cloud Dataflow job, receiving data by the ingestion process. Clean the data in a Cloud Dataflow pipeline.
B. Create a Cloud Function that reads data from BigQuery and cleans it. Trigger the Cloud Function from a Compute Engine instance.
C. Create a SQL statement on the data in BigQuery, and save it as a view. Run the view daily, and save the result to a new table.
D. Use Cloud Dataprep and configure the BigQuery tables as the source. Schedule a daily job to clean the data.

Answer: B

Google Professional-Cloud-Architect Sample Question 25

For this question, refer to the TerramEarth case study. To be compliant with European GDPR regulation, TerramEarth is required to delete data generated from its European customers after a period of 36 months when it contains personal data. In the new architecture, this data will be stored in both Cloud Storage and BigQuery. What should you do?


Options:

A. Create a BigQuery table for the European data, and set the table retention period to 36 months. For Cloud Storage, use gsutil to enable lifecycle management using a DELETE action with an Age condition of 36 months.
B. Create a BigQuery table for the European data, and set the table retention period to 36 months. For Cloud Storage, use gsutil to create a SetStorageClass to NONE action with an Age condition of 36 months.
C. Create a BigQuery time-partitioned table for the European data, and set the partition expiration period to 36 months. For Cloud Storage, use gsutil to enable lifecycle management using a DELETE action with an Age condition of 36 months.
D. Create a BigQuery time-partitioned table for the European data, and set the partition period to 36 months. For Cloud Storage, use gsutil to create a SetStorageClass to NONE action with an Age condition of 36 months.

Answer: C
Explanation: https://cloud.google.com/bigquery/docs/managing-partitioned-tables#partition-expiration https://cloud.google.com/storage/docs/lifecycle
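
As a rough sketch of the two retention controls named in the answer, assuming a daily-partitioned table and placeholder dataset, table, and bucket names; 36 months is approximated here as 1,095 days (94,608,000 seconds):

    # BigQuery: expire partitions after ~36 months (value is in seconds)
    bq update --time_partitioning_expiration 94608000 mydataset.eu_vehicle_data
    # Cloud Storage: delete objects older than ~36 months
    cat > lifecycle.json <<'EOF'
    {"rule": [{"action": {"type": "Delete"}, "condition": {"age": 1095}}]}
    EOF
    gsutil lifecycle set lifecycle.json gs://eu-vehicle-data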

Google Professional-Cloud-Architect Sample Question 26

For this question, refer to the JencoMart case study.

JencoMart wants to move their User Profiles database to Google Cloud Platform. Which Google Database should they use?


Options:

A. Cloud Spanner
B. Google BigQuery
C. Google Cloud SQL
D. Google Cloud Datastore

Answer: D
Explanation: https://cloud.google.com/datastore/docs/concepts/overview. Common workloads for Google Cloud Datastore: user profiles, product catalogs, game state.
References: https://cloud.google.com/storage-options/ https://cloud.google.com/datastore/docs/concepts/overview


and so much more...