
Good news! The CCDAK Confluent Certified Developer for Apache Kafka Certification Examination question set is now stable, with passing results.

CCDAK Practice Exam Questions and Answers

Confluent Certified Developer for Apache Kafka Certification Examination

Last Update: 10 hours ago
Total Questions: 150

The Confluent Certified Developer for Apache Kafka Certification Examination question pool is now stable, with the latest exam questions added 10 hours ago. Incorporating CCDAK practice exam questions into your study plan is more than just a preparation strategy.

By familiarizing yourself with the Confluent Certified Developer for Apache Kafka Certification Examination exam format, identifying knowledge gaps, and applying theoretical knowledge in practical Confluent scenarios, you are setting yourself up for success. CCDAK exam dumps provide a realistic preview, helping you adapt your preparation strategy accordingly.

CCDAK exam questions often include scenarios and problem-solving exercises that mirror real-world challenges. Working through CCDAK dumps allows you to practice pacing yourself, ensuring that you can complete all Confluent Certified Developer for Apache Kafka Certification Examination exam questions within the allotted time frame without sacrificing accuracy.

CCDAK PDF (Printable)
$48 (list price $119.99)

CCDAK Testing Engine
$56 (list price $139.99)

CCDAK PDF + Testing Engine
$70.8 (list price $176.99)
Question # 1

A Kafka topic has a replication factor of 3 and a min.insync.replicas setting of 1. What is the maximum number of brokers that can be down so that a producer with acks=all can still produce to the topic?

Options:

A. 3
B. 0
C. 2
D. 1

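Note: acks=all waits only for the replicas that are currently in sync, and the broker rejects a write only when fewer than min.insync.replicas of them remain, so the failure tolerance is the replication factor minus min.insync.replicas. A small worked sketch of that arithmetic in Java:

public class BrokerFailureTolerance {
    public static void main(String[] args) {
        int replicationFactor = 3;
        int minInsyncReplicas = 1;
        // acks=all waits for the current in-sync replicas only; the broker rejects the
        // write once fewer than min.insync.replicas of them are left. With 3 replicas
        // and min.insync.replicas=1, a single surviving replica is still enough.
        int maxBrokersDown = replicationFactor - minInsyncReplicas;
        System.out.println("Producer with acks=all keeps working with up to "
                + maxBrokersDown + " brokers down"); // prints 2
    }
}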
Question # 2

You are using a JDBC source connector to copy data from two tables to two Kafka topics. One connector is created with max.tasks equal to 2 and deployed on a cluster of 3 workers. How many tasks are launched?

Options:

A. 6
B. 1
C. 2
D. 3

Question # 3

A customer has many consumer applications that process messages from a Kafka topic. Each consumer application can only process 50 MB/s. Your customer wants to achieve a target throughput of 1 GB/s. What is the minimum number of partitions you would suggest to the customer for that topic?

Options:

A. 10
B. 20
C. 1
D. 50

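The sizing here is plain division: the partition count must at least cover the target throughput divided by the per-consumer throughput, since one consumer per partition is the maximum parallelism inside a consumer group. A minimal sketch of the calculation, treating 1 GB as 1000 MB as the question implies:

public class PartitionSizing {
    public static void main(String[] args) {
        double targetThroughputMBps = 1000.0;    // 1 GB/s target, taking 1 GB = 1000 MB
        double perConsumerThroughputMBps = 50.0; // each consumer application handles 50 MB/s
        // A consumer group can use at most one consumer per partition, so the partition
        // count must cover the ratio of the two throughputs.
        int minPartitions = (int) Math.ceil(targetThroughputMBps / perConsumerThroughputMBps);
        System.out.println("Minimum partitions: " + minPartitions); // prints 20
    }
}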
Question # 4

Select all that apply (select THREE)

Options:

A. min.insync.replicas is a producer setting
B. acks is a topic setting
C. acks is a producer setting
D. min.insync.replicas is a topic setting
E. min.insync.replicas matters regardless of the values of acks
F. min.insync.replicas only matters if acks=all

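The point this question tests is where each setting lives: acks is supplied in the producer's client configuration, while min.insync.replicas is a topic-level (or broker default) configuration that only takes effect when producers use acks=all. A minimal sketch of the producer side (the broker address is a placeholder):

import java.util.Properties;
import org.apache.kafka.clients.producer.ProducerConfig;

public class ProducerVsTopicSettings {
    public static void main(String[] args) {
        // acks is a *producer* setting, passed in the client configuration:
        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        producerProps.put(ProducerConfig.ACKS_CONFIG, "all");

        // min.insync.replicas, by contrast, is a *topic* (or broker default) setting.
        // There is no ProducerConfig key for it; it is applied to the topic itself,
        // and it only takes effect when producers use acks=all.
    }
}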
Question # 5

A Zookeeper configuration has a tickTime of 2000, an initLimit of 20, and a syncLimit of 5. What is the timeout value for followers to connect to Zookeeper?

Options:

A. 20 sec
B. 10 sec
C. 2000 ms
D. 40 sec

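As a study note, ZooKeeper's initLimit is expressed in ticks, so the time allowed for followers to connect and sync to the leader is initLimit multiplied by tickTime. A quick worked calculation with the values from the question:

public class ZkFollowerConnectTimeout {
    public static void main(String[] args) {
        int tickTimeMs = 2000; // tickTime from the question, in milliseconds
        int initLimit = 20;    // ticks a follower may take to connect and sync to the leader
        // syncLimit (5 ticks) bounds how far a follower may lag once connected; it does
        // not affect the connect timeout computed here.
        int connectTimeoutMs = initLimit * tickTimeMs; // 20 * 2000 ms = 40 000 ms
        System.out.println("Follower connect timeout: " + connectTimeoutMs + " ms (40 sec)");
    }
}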
Question # 6

To prevent network-induced duplicates when producing to Kafka, I should use

Options:

A. max.in.flight.requests.per.connection=1
B. enable.idempotence=true
C. retries=200000
D. batch.size=1

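For context, enable.idempotence=true assigns the producer an id and per-partition sequence numbers so the broker can discard retried duplicates caused by network issues, without giving up retries. A minimal producer configuration sketch (the broker address is a placeholder):

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

public class IdempotentProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Idempotence de-duplicates retried batches on the broker side; in recent client
        // versions it also implies acks=all and a non-zero retries setting.
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // sends issued here are safe to retry without creating duplicates on the broker
        }
    }
}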
Question # 7

A consumer wants to read messages from a specific partition of a topic. How can this be achieved?

Options:

A. Call subscribe(String topic, int partition) passing the topic and partition number as the arguments
B. Call assign() passing a Collection of TopicPartitions as the argument
C. Call subscribe() passing TopicPartition as the argument

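The API distinction behind this question: subscribe() operates at the topic level and lets the group coordinator assign partitions, whereas assign() takes explicit TopicPartition objects and bypasses group rebalancing entirely. A minimal sketch of manual assignment (broker address, topic name, and partition number are placeholders):

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

public class SinglePartitionReader {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // assign() pins this consumer to partition 0 of "orders"; no consumer-group
            // rebalancing is involved, unlike subscribe().
            consumer.assign(Collections.singletonList(new TopicPartition("orders", 0)));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
            }
        }
    }
}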
Question # 8

In Avro, removing or adding a field that has a default is a __ schema evolution

Options:

A.  

full

B.  

backward

C.  

breaking

D.  

forward

Question # 9

Once sent to a topic, a message can be modified

Options:

A. No
B. Yes

Question # 10

Which actions will trigger partition rebalance for a consumer group? (select three)

Options:

A. Increase partitions of a topic
B. Remove a broker from the cluster
C. Add a new consumer to consumer group
D. A consumer in a consumer group shuts down
E. Add a broker to the cluster

Question # 11

Which client protocols are supported for the Schema Registry? (select two)

Options:

A. HTTP
B. HTTPS
C. JDBC
D. Websocket
E. SASL

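Schema Registry is accessed through its REST interface, so clients reach it over HTTP or HTTPS via the schema.registry.url property. A hedged producer-side sketch, assuming the Confluent Avro serializer is on the classpath (the URLs are placeholders):

import java.util.Properties;
import org.apache.kafka.clients.producer.ProducerConfig;

public class SchemaRegistryClientConfig {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        // The Confluent Avro serializer talks to Schema Registry over its REST API,
        // so the endpoint is an HTTP or HTTPS URL (placeholder below).
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "https://schema-registry.example.com:8081");
    }
}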
Question # 12

Which Kafka CLI should you use to consume from a topic?

Options:

A. kafka-console-consumer
B. kafka-topics
C. kafka-console
D. kafka-consumer-groups

Question # 13

I am producing Avro data on my Kafka cluster that is integrated with the Confluent Schema Registry. After an incompatible schema change, I know my data will be rejected. Which component will reject the data?

Options:

A. The Confluent Schema Registry
B. The Kafka Broker
C. The Kafka Producer itself
D. Zookeeper

Question # 14

In the Kafka consumer metrics, you observe that the fetch rate is very high and each fetch is small. What steps would you take to increase throughput?

Options:

A. Increase fetch.max.wait
B. Increase fetch.max.bytes
C. Decrease fetch.max.bytes
D. Decrease fetch.min.bytes
E. Increase fetch.min.bytes

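A very high fetch rate with small fetches means the consumer is issuing many nearly empty requests. Raising fetch.min.bytes (optionally together with fetch.max.wait.ms) lets the broker hold each response until more data has accumulated, so every fetch carries more records. A consumer configuration sketch with illustrative values (the broker address and group id are placeholders):

import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;

public class BatchierFetches {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "order-processors");        // placeholder
        // Wait until at least ~64 KB are available before the broker answers a fetch...
        props.put(ConsumerConfig.FETCH_MIN_BYTES_CONFIG, "65536");
        // ...but never wait longer than 500 ms, to bound the added latency.
        props.put(ConsumerConfig.FETCH_MAX_WAIT_MS_CONFIG, "500");
    }
}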
Question # 15

Two consumers share the same group.id (consumer group id). Each consumer will

Options:

A. Read mutually exclusive offset blocks on all the partitions
B. Read all the data on mutually exclusive partitions
C. Read all data from all partitions

Question # 16

What are the requirements for a Kafka broker to connect to a Zookeeper ensemble? (select two)

Options:

A. Unique value for each broker's zookeeper.connect parameter
B. Unique values for each broker's broker.id parameter
C. All the brokers must share the same broker.id
D. All the brokers must share the same zookeeper.connect parameter

Question # 17

Is KSQL ANSI SQL compliant?

Options:

A. Yes
B. No

Question # 18

A topic receives all the orders for the products that are available on a commerce site. Two applications want to process all the messages independently - order fulfilment and monitoring. The topic has 4 partitions. How would you organise the consumers for optimal performance and resource usage?

Options:

A. Create 8 consumers in the same group with 4 consumers for each application
B. Create two consumer groups for two applications with 8 consumers in each
C. Create two consumer groups for two applications with 4 consumers in each
D. Create four consumers in the same group, one for each partition - two for fulfilment and two for monitoring

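Since both applications must see every order, each needs its own consumer group, and with 4 partitions a group gains nothing from more than 4 consumers. A sketch of how the two applications would differ only in their group.id (broker address, group names, and topic are placeholders):

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class OrderConsumers {
    // Each application starts 4 instances of this consumer (one per partition),
    // but with its own group id, e.g. "fulfilment" and "monitoring" (placeholder names).
    static KafkaConsumer<String, String> newConsumer(String groupId) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Collections.singletonList("orders"));
        return consumer;
    }

    public static void main(String[] args) {
        // Both groups independently receive every record on the topic.
        KafkaConsumer<String, String> fulfilment = newConsumer("fulfilment");
        KafkaConsumer<String, String> monitoring = newConsumer("monitoring");
        fulfilment.close();
        monitoring.close();
    }
}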
Question # 19

You are sending messages with keys to a topic. To increase throughput, you decide to increase the number of partitions of the topic. Select all that apply.

Options:

A. All the existing records will get rebalanced among the partitions to balance load
B. New records with the same key will get written to the partition where old records with that key were written
C. New records may get written to a different partition
D. Old records will stay in their partitions

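The reason key stickiness breaks is that the default partitioner hashes the key modulo the current partition count, so the same key can map to a different partition once that count changes; records already written are never moved. A sketch of the mapping, assuming the murmur2-based formula used by the Java client's default partitioner for keyed records (the key is a placeholder):

import java.nio.charset.StandardCharsets;
import org.apache.kafka.common.utils.Utils;

public class KeyToPartition {
    // Same shape of formula the Java client applies to records that carry a key.
    static int partitionFor(String key, int numPartitions) {
        byte[] keyBytes = key.getBytes(StandardCharsets.UTF_8);
        return Utils.toPositive(Utils.murmur2(keyBytes)) % numPartitions;
    }

    public static void main(String[] args) {
        String key = "customer-42"; // placeholder key
        // The target partition can change once the partition count changes,
        // while records already written stay where they are.
        System.out.println("With  8 partitions: " + partitionFor(key, 8));
        System.out.println("With 12 partitions: " + partitionFor(key, 12));
    }
}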
Question # 20

What is true about Kafka brokers and clients from version 0.10.2 onwards?

Options:

A. Clients and brokers must have the exact same version to be able to communicate
B. A newer client can talk to a newer broker, but an older client cannot talk to a newer broker
C. A newer client can talk to a newer broker, and an older client can talk to a newer broker
D. A newer client can't talk to a newer broker, but an older client can talk to a newer broker

Question # 21

If I want to have an extremely high confidence that leaders and replicas have my data, I should use

Options:

A. acks=all, replication factor=2, min.insync.replicas=1
B. acks=1, replication factor=3, min.insync.replicas=2
C. acks=all, replication factor=3, min.insync.replicas=2
D. acks=all, replication factor=3, min.insync.replicas=1

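The high-confidence combination pairs acks=all on the producer with a topic that has three replicas and min.insync.replicas=2, so every acknowledged write exists on at least two brokers. A topic-creation sketch with those settings (the broker address and topic name are placeholders):

import java.util.Collections;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.config.TopicConfig;

public class DurableTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder

        // 6 partitions (arbitrary), replication factor 3, min.insync.replicas=2:
        // combined with a producer using acks=all, every acknowledged record is on
        // at least two brokers before the producer treats it as written.
        NewTopic topic = new NewTopic("payments", 6, (short) 3)
                .configs(Map.of(TopicConfig.MIN_IN_SYNC_REPLICAS_CONFIG, "2"));

        try (AdminClient admin = AdminClient.create(props)) {
            admin.createTopics(Collections.singleton(topic)).all().get();
        }
    }
}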
Question # 22

In Java, Avro SpecificRecord classes are

Options:

A. automatically generated from an Avro Schema
B. written manually by the programmer
C. automatically generated from an Avro Schema + a Maven / Gradle Plugin

Get CCDAK dumps and pass your exam in 24 hours!
