Reviews of "Redacting Confidential Data within your Pipelines in Cloud Data Fusion"

Reviews

Daniel M. · reviewed 11 months ago

Shivam T. · reviewed 11 months ago

ADITYA K. · reviewed 11 months ago

Dhyéy S. · reviewed 11 months ago

Jalina H. · reviewed 11 months ago

Suyash b. · reviewed 11 months ago

Souradeep M. · reviewed 11 months ago

Souradeep M. · reviewed 11 months ago

Chandan P. · reviewed 11 months ago

Pragnesh S. · reviewed 11 months ago

A complete waste of 1.5 hours. The lab is tedious and long, most of the time is spent waiting for the service to be provisioned, and each job run takes more than 10 minutes. Then the job fails even though the pipeline preview ran fine:

01/09/2025 23:10:01 ERROR Aborting task
01/09/2025 23:10:01 INFO Successfully repaired 'gs://qwiklabs-gcp-00-e1df31d1f801/2025-01-09-15-01/_temporary/0/_temporary/' directory.
01/09/2025 23:10:01 ERROR Task attempt_20250109150913877997772046728130_0005_r_000000_0 aborted.
01/09/2025 23:10:02 ERROR Exception in task 0.0 in stage 0.0 (TID 0)
01/09/2025 23:10:02 WARN Lost task 0.0 in stage 0.0 (TID 0) (cdap-testpipel-ad1cdc98-ce9a-11ef-90b8-2eb8c6a7bfbd-w-0.us-east1-d.c.qwiklabs-gcp-00-e1df31d1f801.internal executor 2):
org.apache.spark.SparkException: Task failed while writing rows
    at org.apache.spark.internal.io.SparkHadoopWriter$.executeTask(SparkHadoopWriter.scala:163)
    at org.apache.spark.internal.io.SparkHadoopWriter$.$anonfun$write$1(SparkHadoopWriter.scala:88)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
    at org.apache.spark.scheduler.Task.run(Task.scala:136)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1505)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: io.cdap.cdap.api.data.format.UnexpectedFormatException: field State cannot be set to a null value.
    at io.cdap.cdap.api.data.format.StructuredRecord$Builder.validateAndGetField(StructuredRecord.java:708)
    at io.cdap.cdap.api.data.format.StructuredRecord$Builder.set(StructuredRecord.java:386)
    at io.cdap.plugin.format.delimited.common.DelimitedStructuredRecordStringConverter.parseAndSetFieldValue(DelimitedStructuredRecordStringConverter.java:40)
    at io.cdap.plugin.format.delimited.input.PathTrackingDelimitedInputFormat$1.nextKeyValue(PathTrackingDelimitedInputFormat.java:95)
    at io.cdap.plugin.format.input.PathTrackingInputFormat$TrackingRecordReader.nextKeyValue(PathTrackingInputFormat.java:136)
    at org.apache.hadoop.mapreduce.lib.input.CombineFileRecordReaderWrapper.nextKeyValue(CombineFileRecordReaderWrapper.java:90)
    at org.apache.hadoop.mapreduce.lib.input.CombineFileRecordReader.nextKeyValue(CombineFileRecordReader.java:65)
    at org.apache.spark.rdd.NewHadoopRDD$$anon$1.hasNext(NewHadoopRDD.scala:251)
    at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:491)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:491)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:491)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:491)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:491)
    at org.apache.spark.internal.io.SparkHadoopWriter$.$anonfun$executeTask$1(SparkHadoopWriter.scala:136)
    at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1539)
    at org.apache.spark.internal.io.SparkHadoopWriter$.executeTask(SparkHadoopWriter.scala:135)
    ... 9 more

Failed due to DLP rate limit, please request more quota from DLP: https://cloud.google.com/dlp/limits#increases
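For readers hitting the same failure: the root cause named in the "Caused by" line above is a delimited row whose State column is empty while the pipeline's output schema declares State non-nullable. Below is a minimal Python sketch of that validation; the names and types here are illustrative stand-ins, not the CDAP API. It shows why the run aborts and why marking the field nullable in the schema lets the same row pass.

```python
# Illustrative sketch only: mimics the nullability check that CDAP's
# StructuredRecord builder performs when a delimited input is parsed
# against the pipeline's output schema. Field/parse_delimited are
# hypothetical names, not CDAP classes.
from dataclasses import dataclass

@dataclass
class Field:
    name: str
    nullable: bool = False  # CDAP schemas default fields to non-nullable

def parse_delimited(line: str, schema: list, delimiter: str = ","):
    """Parse one delimited line into a record, enforcing nullability."""
    record = {}
    for field, raw in zip(schema, line.split(delimiter)):
        value = raw if raw != "" else None  # empty column -> null
        if value is None and not field.nullable:
            # CDAP raises UnexpectedFormatException at this point;
            # a plain ValueError stands in for it here.
            raise ValueError(
                "field %s cannot be set to a null value." % field.name
            )
        record[field.name] = value
    return record

strict_schema = [Field("Name"), Field("State")]
lenient_schema = [Field("Name"), Field("State", nullable=True)]

parse_delimited("Alice,CA", strict_schema)   # ok: every column populated
parse_delimited("Bob,", lenient_schema)      # ok: State may be null
try:
    parse_delimited("Bob,", strict_schema)   # aborts, like the lab pipeline
except ValueError as err:
    print(err)
```

Under this reading, the fix on the CDAP side is to edit the schema in the source or Wrangler stage so State allows nulls (or to drop/fill empty State values before the sink), rather than re-running the job. The separate "DLP rate limit" message at the end of the log is an unrelated quota issue.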

Allen Y. · reviewed 11 months ago

Manish S. · reviewed 11 months ago

Pramod K. · reviewed 11 months ago

Nakul W. · reviewed 11 months ago

I can't remove the chat widget on the page; it's irritating and blocks my view, so I can't see what's wrong.

Dhanush R. · reviewed 11 months ago

CHETAN S. · reviewed 11 months ago

Aryan D. · reviewed 11 months ago


avinash k. · reviewed 11 months ago

Atindra G. · reviewed 11 months ago

Somes S. · reviewed 11 months ago

vyshnav r. · reviewed 11 months ago

Isuru K. · reviewed 11 months ago

Van Khanh N. · reviewed 11 months ago

Atul P. · reviewed 11 months ago

Rahul G. · reviewed 11 months ago

We cannot guarantee that published reviews come from consumers who have purchased or used the product. Reviews are not verified by Google.