Latest Associate-Developer-Apache-Spark Free Dumps - Databricks Certified Associate Developer for Apache Spark 3.0
Which of the following options describes the responsibility of the executors in Spark?
Correct answer: E
Explanation: (Visible to DumpTOP members only)
The code block displayed below contains an error. The code block should return a DataFrame in which column predErrorAdded contains the results of Python function add_2_if_geq_3 as applied to numeric and nullable column predError in DataFrame transactionsDf. Find the error.
Code block:
def add_2_if_geq_3(x):
    if x is None:
        return x
    elif x >= 3:
        return x+2
    return x

add_2_if_geq_3_udf = udf(add_2_if_geq_3)

transactionsDf.withColumnRenamed("predErrorAdded", add_2_if_geq_3_udf(col("predError")))
Correct answer: B
Explanation: (Visible to DumpTOP members only)
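For context (not part of the original question): withColumnRenamed only renames an existing column and does not evaluate expressions. A minimal PySpark sketch of how a UDF result is normally attached as a new column, reusing the names from the question:

Code block (for reference):
from pyspark.sql.functions import udf, col

def add_2_if_geq_3(x):
    if x is None:
        return x
    elif x >= 3:
        return x + 2
    return x

# Wrap the Python function as a UDF (returns StringType unless a return type is given).
add_2_if_geq_3_udf = udf(add_2_if_geq_3)

# withColumn attaches the computed values under the new column name.
transactionsDf.withColumn("predErrorAdded", add_2_if_geq_3_udf(col("predError")))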
Which of the following describes a difference between Spark's cluster and client execution modes?
Correct answer: B
Explanation: (Visible to DumpTOP members only)
Which of the following code blocks returns a DataFrame that matches the multi-column DataFrame itemsDf, except that integer column itemId has been converted into a string column?
Correct answer: B
Explanation: (Visible to DumpTOP members only)
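Not part of the original question: a minimal PySpark sketch, assuming itemsDf exists with an integer column itemId, of one common way to cast a single column to string while keeping all other columns:

Code block (for reference):
from pyspark.sql.functions import col

# Replace itemId with its string-typed version; every other column stays unchanged.
itemsDf.withColumn("itemId", col("itemId").cast("string"))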
Which of the following code blocks uses a schema fileSchema to read a parquet file at location filePath into a DataFrame?
Correct answer: A
Explanation: (Visible to DumpTOP members only)
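For reference, a minimal sketch of how a predefined schema is typically supplied when reading parquet, assuming fileSchema is a StructType and filePath points to the file:

Code block (for reference):
# Apply the predefined schema instead of inferring it from the parquet metadata.
spark.read.schema(fileSchema).parquet(filePath)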
Which of the following code blocks can be used to save DataFrame transactionsDf to memory only, recalculating partitions that do not fit in memory when they are needed?
Correct answer: F
Explanation: (Visible to DumpTOP members only)
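Not part of the original question: a sketch of the storage level that keeps data in memory only and recomputes partitions that do not fit, reusing the DataFrame name from the question:

Code block (for reference):
from pyspark import StorageLevel

# MEMORY_ONLY never spills to disk; partitions that do not fit in memory
# are recomputed from the lineage the next time they are needed.
transactionsDf.persist(StorageLevel.MEMORY_ONLY)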
The code block shown below should read all files with the file ending .png in directory path into Spark.
Choose the answer that correctly fills the blanks in the code block to accomplish this.
spark.__1__.__2__(__3__).option(__4__, "*.png").__5__(path)
Correct answer: E
Explanation: (Visible to DumpTOP members only)
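For context (not the graded answer, whose option letters are not shown here): Spark 3.x ships a binaryFile data source, and pathGlobFilter is the read option that restricts the files picked up by name pattern. A minimal sketch:

Code block (for reference):
# Read every file under `path` whose name matches *.png as binary content
# (resulting columns: path, modificationTime, length, content).
spark.read.format("binaryFile").option("pathGlobFilter", "*.png").load(path)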
Which of the following code blocks adds a column predErrorSqrt to DataFrame transactionsDf that is the square root of column predError?
Correct answer: C
Explanation: (Visible to DumpTOP members only)
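Not part of the original question: a minimal PySpark sketch of adding a square-root column, reusing the DataFrame and column names from the question:

Code block (for reference):
from pyspark.sql.functions import sqrt, col

# Adds predErrorSqrt computed from predError; existing columns are left unchanged.
transactionsDf.withColumn("predErrorSqrt", sqrt(col("predError")))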
Which of the following code blocks returns all unique values of column storeId in DataFrame transactionsDf?
Correct answer: C
Explanation: (Visible to DumpTOP members only)
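For reference, a minimal sketch of how the unique values of a single column are usually obtained:

Code block (for reference):
# select() narrows the DataFrame to the column of interest, distinct() drops duplicates.
transactionsDf.select("storeId").distinct()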
Which of the following describes the characteristics of accumulators?
Correct answer: A
Explanation: (Visible to DumpTOP members only)
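Not part of the original question: a small sketch illustrating typical accumulator behavior, namely that tasks on the executors may only add to an accumulator while its value is read back on the driver; spark and transactionsDf are assumed to exist:

Code block (for reference):
# Create a numeric accumulator on the driver.
row_counter = spark.sparkContext.accumulator(0)

def count_row(row):
    # Executor-side tasks can only add to the accumulator, not read it.
    row_counter.add(1)

# An action is required for the accumulator updates to reach the driver.
transactionsDf.foreach(count_row)

# The accumulated value is only reliably readable on the driver.
print(row_counter.value)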