
Professional-Cloud-Security-Engineer (Google Cloud Certified - Professional Cloud Security Engineer) practice material is now stable with proven pass results | Test Your Knowledge for Free

Exams4sure Dumps

Professional-Cloud-Security-Engineer Practice Questions

Google Cloud Certified - Professional Cloud Security Engineer

Last Update: 22 hours ago
Total Questions: 318

Dive into our fully updated and stable Professional-Cloud-Security-Engineer practice test platform, featuring all the latest Google Cloud Certified exam questions added this week. Our preparation tool is more than just a Google study aid; it's a strategic advantage.

Our free Google Cloud Certified practice questions are crafted to reflect the domains and difficulty of the actual exam. The detailed rationales explain the 'why' behind each answer, reinforcing key Professional-Cloud-Security-Engineer concepts. Use this test to pinpoint the areas where you should focus your study.

Professional-Cloud-Security-Engineer PDF

Professional-Cloud-Security-Engineer PDF (Printable)
$43.75
$124.99

Professional-Cloud-Security-Engineer Testing Engine

Professional-Cloud-Security-Engineer Testing Engine
$50.75
$144.99

Professional-Cloud-Security-Engineer PDF + Testing Engine

Professional-Cloud-Security-Engineer PDF + Testing Engine
$63.70
$181.99
Question # 31

You need to implement an encryption-at-rest strategy that protects sensitive data and reduces key management complexity for non-sensitive data. Your solution has the following requirements:

Schedule key rotation for sensitive data.

Control which region the encryption keys for sensitive data are stored in.

Minimize the latency to access encryption keys for both sensitive and non-sensitive data.

What should you do?

Options:

A.  

Encrypt non-sensitive data and sensitive data with Cloud External Key Manager.

B.  

Encrypt non-sensitive data and sensitive data with Cloud Key Management Service.

C.  

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud External Key Manager.

D.  

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud Key Management Service.

Discussion 0
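For context on the trade-offs behind these options: scheduled rotation and key storage location are first-class settings on Cloud KMS keys, while Google default encryption needs no configuration at all for non-sensitive data. A minimal Terraform sketch of the CMEK side (ring name, key name, region, and rotation period are illustrative assumptions, not values from the question):

```hcl
# Key ring pinned to a specific region for the sensitive-data keys.
resource "google_kms_key_ring" "sensitive" {
  name     = "sensitive-data-ring"   # illustrative name
  location = "europe-west1"          # controls where keys are stored
}

# CMEK key with automatic scheduled rotation.
resource "google_kms_crypto_key" "sensitive" {
  name            = "sensitive-data-key"
  key_ring        = google_kms_key_ring.sensitive.id
  rotation_period = "7776000s"       # rotate every 90 days
}
```

Non-sensitive data left under Google default encryption requires no key management and adds no key-access latency, since Google manages those keys entirely.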
Question # 32

You are responsible for managing identities in your company's Google Cloud organization. Employees are frequently using your organization's corporate domain name to create unmanaged Google accounts. You want to implement a practical and efficient solution to prevent employees from completing this action in the future. What should you do?

Options:

A.  

Implement an automated process that scans all identities in your organization and disables any unmanaged accounts.

B.  

Create a Google Cloud identity for all users in your organization. Ensure that new users are added automatically.

C.  

Register a new domain for your Google Cloud resources. Move all existing identities and resources to this domain.

D.  

Switch your corporate email system to another domain to avoid using the same domain for Google Cloud identities and corporate emails.

Discussion 0
Question # 33

Your company has deployed an artificial intelligence model in a central project. This model has a lot of sensitive intellectual property and must be kept strictly isolated from the internet. You must expose the model endpoint only to a defined list of projects in your organization. What should you do?

Options:

A.  

Within the model project, create an external Application Load Balancer that points to the model endpoint. Create a Cloud Armor policy to restrict IP addresses to Google Cloud.

B.  

Within the model project, create an internal Application Load Balancer that points to the model endpoint. Expose this load balancer with Private Service Connect to a configured list of projects.

C.  

Activate Private Google Access in both the model project and in each project that needs to connect to the model. Create a firewall policy to allow connectivity to Private Google Access addresses.

D.  

Create a central project to host Shared VPC networks that are provided to all other projects. Centrally administer all firewall rules in this project to grant access to the model.

Discussion 0
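Exposing an internal load balancer to a defined list of consumer projects is what a Private Service Connect service attachment with a manual accept list does. A hedged Terraform sketch (the forwarding rule, NAT subnet, names, region, and consumer project ID are illustrative assumptions defined elsewhere):

```hcl
# Publish the internal load balancer via Private Service Connect and
# accept connections only from an explicit list of consumer projects.
resource "google_compute_service_attachment" "model_endpoint" {
  name                  = "model-endpoint-attachment"   # illustrative
  region                = "us-central1"
  target_service        = google_compute_forwarding_rule.model_ilb.id
  connection_preference = "ACCEPT_MANUAL"               # explicit allow list
  enable_proxy_protocol = false
  nat_subnets           = [google_compute_subnetwork.psc_nat.id]

  consumer_accept_lists {
    project_id_or_num = "consumer-project-a"   # allowed consumer project
    connection_limit  = 10
  }
}
```

Traffic never traverses the internet; consumers connect through a private endpoint in their own VPC.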
Question # 34

You are implementing a new web application on Google Cloud that will be accessed from your on-premises network. To provide protection from threats like malware, you must implement transport layer security (TLS) interception for incoming traffic to your application. What should you do?

Options:

A.  

Configure Secure Web Proxy. Offload the TLS traffic in the load balancer, inspect the traffic, and forward the traffic to the web application.

B.  

Configure an internal proxy load balancer. Offload the TLS traffic in the load balancer, inspect the traffic, and forward the traffic to the web application.

C.  

Configure a hierarchical firewall policy. Enable TLS interception by using Cloud Next Generation Firewall (NGFW) Enterprise.

D.  

Configure a VPC firewall rule. Enable TLS interception by using Cloud Next Generation Firewall (NGFW) Enterprise.

Discussion 0
Question # 35

You want data on Compute Engine disks to be encrypted at rest with keys managed by Cloud Key Management Service (KMS). Cloud Identity and Access Management (IAM) permissions to these keys must be managed in a grouped way because the permissions should be the same for all keys.

What should you do?

Options:

A.  

Create a single KeyRing for all persistent disks and all Keys in this KeyRing. Manage the IAM permissions at the Key level.

B.  

Create a single KeyRing for all persistent disks and all Keys in this KeyRing. Manage the IAM permissions at the KeyRing level.

C.  

Create a KeyRing per persistent disk, with each KeyRing containing a single Key. Manage the IAM permissions at the Key level.

D.  

Create a KeyRing per persistent disk, with each KeyRing containing a single Key. Manage the IAM permissions at the KeyRing level.

Discussion 0
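The "grouped permissions" requirement maps to granting IAM at the KeyRing level, where it is inherited by every key in the ring. A minimal Terraform sketch (ring name, location, and the service agent email are illustrative assumptions):

```hcl
# Single key ring for all persistent-disk keys; IAM is granted once,
# at the key ring level, and inherited by every key inside it.
resource "google_kms_key_ring" "disks" {
  name     = "pd-cmek-ring"      # illustrative name
  location = "us-central1"
}

resource "google_kms_key_ring_iam_member" "disk_encrypter" {
  key_ring_id = google_kms_key_ring.disks.id
  role        = "roles/cloudkms.cryptoKeyEncrypterDecrypter"
  # Placeholder for the Compute Engine service agent that wraps/unwraps
  # the disk encryption keys; the project number is hypothetical.
  member = "serviceAccount:service-123456789@compute-system.iam.gserviceaccount.com"
}
```

With per-disk rings and per-key bindings instead, every permission change would have to be repeated across all rings or keys.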
Question # 36

As adoption of the Cloud Data Loss Prevention (DLP) API grows within the company, you need to optimize usage to reduce cost. DLP target data is stored in Cloud Storage and BigQuery. The location and region are identified as a suffix in the resource name.

Which cost reduction options should you recommend?

Options:

A.  

Set appropriate rowsLimit value on BigQuery data hosted outside the US and set appropriate bytesLimitPerFile value on multiregional Cloud Storage buckets.

B.  

Set appropriate rowsLimit value on BigQuery data hosted outside the US, and minimize transformation units on multiregional Cloud Storage buckets.

C.  

Use rowsLimit and bytesLimitPerFile to sample data and use CloudStorageRegexFileSet to limit scans.

D.  

Use FindingLimits and TimespanConfig to sample data and minimize transformation units.

Discussion 0
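The sampling controls named in the options correspond to fields on a DLP inspect job: bytesLimitPerFile and rowsLimit cap how much of each object or table is scanned, and CloudStorageRegexFileSet narrows which objects are scanned at all. A hedged Terraform sketch of a scheduled, sampled Cloud Storage scan (project, bucket, template, and regex are illustrative assumptions):

```hcl
resource "google_data_loss_prevention_job_trigger" "sampled_scan" {
  parent       = "projects/my-project"   # illustrative project
  display_name = "sampled-gcs-scan"

  triggers {
    schedule {
      recurrence_period_duration = "86400s"   # run daily
    }
  }

  inspect_job {
    inspect_template_name = "projects/my-project/inspectTemplates/default"
    storage_config {
      cloud_storage_options {
        bytes_limit_per_file = 1048576   # sample at most 1 MiB per file
        file_set {
          regex_file_set {               # CloudStorageRegexFileSet
            bucket_name   = "my-bucket"
            include_regex = ["logs/.*\\.csv"]   # limit the scan scope
          }
        }
      }
    }
    actions {
      publish_summary_to_cscc {}
    }
  }
}
```

For BigQuery sources, the analogous knob is rows_limit under big_query_options; both reduce the bytes inspected, which is what DLP billing is based on.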
Question # 37

Your team uses a service account to authenticate data transfers from a given Compute Engine virtual machine instance to a specified Cloud Storage bucket. An engineer accidentally deletes the service account, which breaks application functionality. You want to recover the application as quickly as possible without compromising security.

What should you do?

Options:

A.  

Temporarily disable authentication on the Cloud Storage bucket.

B.  

Use the undelete command to recover the deleted service account.

C.  

Create a new service account with the same name as the deleted service account.

D.  

Update the permissions of another existing service account and supply those credentials to the applications.

Discussion 0
Question # 38

You are developing a new application that uses exclusively Compute Engine VMs. Once a day, this application will execute five different batch jobs. Each of the batch jobs requires a dedicated set of permissions on Google Cloud resources outside of your application. You need to design a secure access concept for the batch jobs that adheres to the least-privilege principle.

What should you do?

Options:

A.  

1. Create a general service account "g-sa" to execute the batch jobs. 2. Grant the permissions required to execute the batch jobs to g-sa. 3. Execute the batch jobs with the permissions granted to g-sa.

B.  

1. Create a general service account "g-sa" to orchestrate the batch jobs. 2. Create one service account per batch job, "b-sa-[1-5]", and grant only the permissions required to run the individual batch jobs to the service accounts. 3. Grant the Service Account Token Creator role to g-sa. Use g-sa to obtain short-lived access tokens for b-sa-[1-5] and to execute the batch jobs with the permissions of b-sa-[1-5].

C.  

1. Create a workload identity pool and configure workload identity pool providers for each batch job. 2. Assign the workload identity user role to each of the identities configured in the providers. 3. Create one service account per batch job, "b-sa-[1-5]", and grant only the permissions required to run the individual batch jobs to the service accounts. 4. Generate credential configuration files for each of the providers. Use these files to execute the batch jobs.

D.  

1. Create a general service account "g-sa" to orchestrate the batch jobs. 2. Create one service account per batch job, "b-sa-[1-5]". Grant only the permissions required to run the individual batch jobs to the service accounts, and generate service account keys for each of these service accounts. 3. Store the service account keys in Secret Manager. Grant g-sa access to Secret Manager and run the batch jobs with the permissions of b-sa-[1-5].

Discussion 0
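The orchestrator-plus-impersonation pattern from option B can be sketched in Terraform: g-sa receives only the Service Account Token Creator role on each job account, so it can mint short-lived tokens without holding the jobs' permissions itself. Account IDs follow the names in the question; everything else is an illustrative assumption:

```hcl
# Orchestrator account: holds no batch-job permissions itself.
resource "google_service_account" "g_sa" {
  account_id   = "g-sa"
  display_name = "Batch orchestrator"
}

# One dedicated account per batch job.
resource "google_service_account" "b_sa" {
  count        = 5
  account_id   = "b-sa-${count.index + 1}"
  display_name = "Batch job ${count.index + 1}"
}

# Allow g-sa to mint short-lived access tokens for each job account.
resource "google_service_account_iam_member" "token_creator" {
  count              = 5
  service_account_id = google_service_account.b_sa[count.index].name
  role               = "roles/iam.serviceAccountTokenCreator"
  member             = "serviceAccount:${google_service_account.g_sa.email}"
}
```

This avoids long-lived service account keys entirely, unlike the Secret Manager approach in option D.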
Question # 39

A company is deploying their application on Google Cloud Platform. Company policy requires long-term data to be stored using a solution that can automatically replicate data across at least two geographic locations.

Which storage solution are they allowed to use?

Options:

A.  

Cloud Bigtable

B.  

Cloud BigQuery

C.  

Compute Engine SSD Disk

D.  

Compute Engine Persistent Disk

Discussion 0
Question # 40

You manage one of your organization's Google Cloud projects (Project A). A VPC Service Controls (VPC SC) perimeter is blocking API access requests to this project, including Pub/Sub. A resource running under a service account in another project (Project B) needs to collect messages from a Pub/Sub topic in your project. Project B is not included in a VPC SC perimeter. You need to provide access from Project B to the Pub/Sub topic in Project A using the principle of least privilege.

What should you do?

Options:

A.  

Configure an ingress policy for the perimeter in Project A and allow access for the service account in Project B to collect messages.

B.  

Create an access level that allows a developer in Project B to subscribe to the Pub/Sub topic that is located in Project A.

C.  

Create a perimeter bridge between Project A and Project B to allow the required communication between both projects.

D.  

Remove the Pub/Sub API from the list of restricted services in the perimeter configuration for Project A.

Discussion 0
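A perimeter ingress rule scoped to a single identity and a single service is the narrowest of these options. A hedged Terraform sketch of such a rule on Project A's perimeter (the access policy ID, project number, and service account email are illustrative assumptions):

```hcl
# Ingress rule: only the named service account in Project B may call
# Pub/Sub methods on resources inside Project A's perimeter.
resource "google_access_context_manager_service_perimeter" "project_a" {
  parent = "accessPolicies/123456789"   # illustrative access policy
  name   = "accessPolicies/123456789/servicePerimeters/project_a"
  title  = "project-a-perimeter"

  status {
    resources           = ["projects/1111111111"]   # Project A number
    restricted_services = ["pubsub.googleapis.com"]

    ingress_policies {
      ingress_from {
        identities = ["serviceAccount:reader@project-b.iam.gserviceaccount.com"]
        sources {
          access_level = "*"   # any origin outside the perimeter
        }
      }
      ingress_to {
        resources = ["projects/1111111111"]
        operations {
          service_name = "pubsub.googleapis.com"
          method_selectors {
            method = "*"
          }
        }
      }
    }
  }
}
```

A perimeter bridge or removing Pub/Sub from the restricted services list would open access far more broadly than a single-identity ingress rule.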
Get Professional-Cloud-Security-Engineer dumps and pass your exam in 24 hours!
