Serverless Data Analysis with Dataflow: Side Inputs (Python) Reviews
41228 reviews
Victor D. · Reviewed more than a year ago
Guy S. · Reviewed more than a year ago
Jose I. R. · Reviewed more than a year ago
能静 T. · Reviewed more than a year ago
Ankala J. · Reviewed more than a year ago
Anthony K. · Reviewed more than a year ago
Preethi G. · Reviewed more than a year ago
Anupam P. · Reviewed more than a year ago
Gokulraj M. · Reviewed more than a year ago
BIDHU S. · Reviewed more than a year ago
Himanshu G. · Reviewed more than a year ago
Matias R. · Reviewed more than a year ago
Cameron L. · Reviewed more than a year ago
Maximilian L. · Reviewed more than a year ago
Gajendra B. · Reviewed more than a year ago
Vignesh C. · Reviewed more than a year ago
Aswini D. · Reviewed more than a year ago
Anupam P. · Reviewed more than a year ago
Amar K. · Reviewed more than a year ago
Anupam P. · Reviewed more than a year ago
I'm facing the error below and I'm not able to complete the lab. Please help me get past it:
'''
student-01-0f2c45cb00c1@training-vm:~/training-data-analyst/courses/data_analysis/lab2/python$ python3 JavaProjectsThatNeedHelp.py --bucket $BUCKET --project $PROJECT --region $REGION --DirectRunner
/home/student-01-0f2c45cb00c1/training-data-analyst/courses/data_analysis/lab2/python/JavaProjectsThatNeedHelp.py:166: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
  bigqcollection = p | 'ReadFromBQ' >> beam.io.Read(beam.io.BigQuerySource(project=project,query=get_java_query))
/usr/local/lib/python3.9/dist-packages/apache_beam/io/gcp/bigquery.py:2485: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1418, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 624, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1572, in apache_beam.runners.common._OutputHandler.handle_process_outputs
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/runners/worker/bundle_processor.py", line 1515, in process
    for part, size in self.restriction_provider.split_and_size(
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/transforms/core.py", line 335, in split_and_size
    for part in self.split(element, restriction):
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/io/iobase.py", line 1641, in split
    estimated_size = restriction.source().estimate_size()
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/io/gcp/bigquery.py", line 721, in estimate_size
    job = bq._start_query_job(
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/utils/retry.py", line 275, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/io/gcp/bigquery_tools.py", line 615, in _start_query_job
    return self._start_job(request)
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/io/gcp/bigquery_tools.py", line 561, in _start_job
    response = self.client.jobs.Insert(request, upload=upload)
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py", line 343, in Insert
    return self._RunMethod(
  File "/usr/local/lib/python3.9/dist-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.9/dist-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.9/dist-packages/apitools/base/py/base_api.py", line 603, in __ProcessHttpResponse
    raise exceptions.HttpError.FromResponse(
apitools.base.py.exceptions.HttpForbiddenError: HttpError accessing <https://bigquery.googleapis.com/bigquery/v2/projects/qwiklabs-gcp-00-929110efe78e/jobs?alt=json>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Wed, 24 Jul 2024 11:18:53 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '403', 'content-length': '502', '-content-encoding': 'gzip'}>, content <{ "error": { "code": 403, "message": "Access Denied: Project qwiklabs-gcp-00-929110efe78e: User does not have bigquery.jobs.create permission in project qwiklabs-gcp-00-929110efe78e.", "errors": [ { "message": "Access Denied: Project qwiklabs-gcp-00-929110efe78e: User does not have bigquery.jobs.create permission in project qwiklabs-gcp-00-929110efe78e.", "domain": "global", "reason": "accessDenied" } ], "status": "PERMISSION_DENIED" } } >

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/student-01-0f2c45cb00c1/training-data-analyst/courses/data_analysis/lab2/python/JavaProjectsThatNeedHelp.py", line 190, in <module>
    run()
  File "/home/student-01-0f2c45cb00c1/training-data-analyst/courses/data_analysis/lab2/python/JavaProjectsThatNeedHelp.py", line 185, in run
    p.run().wait_until_finish()
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/pipeline.py", line 577, in run
    return self.runner.run_pipeline(self, self._options)
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/runners/direct/direct_runner.py", line 131, in run_pipeline
    return runner.run_pipeline(pipeline, options)
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 202, in run_pipeline
    self._latest_run_result = self.run_via_runner_api(
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 224, in run_via_runner_api
    return self.run_stages(stage_context, stages)
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 455, in run_stages
    bundle_results = self._execute_bundle(
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 783, in _execute_bundle
    self._run_bundle(
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 1012, in _run_bundle
    result, splits = bundle_manager.process_bundle(
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/runners/portability/fn_api_runner/fn_runner.py", line 1348, in process_bundle
    result_future = self._worker_handler.control_conn.push(process_bundle_req)
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/runners/portability/fn_api_runner/worker_handlers.py", line 379, in push
    response = self.worker.do_instruction(request)
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/runners/worker/sdk_worker.py", line 624, in do_instruction
    return getattr(self, request_type)(
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/runners/worker/sdk_worker.py", line 662, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/runners/worker/bundle_processor.py", line 1062, in process_bundle
    input_op_by_transform_id[element.transform_id].process_encoded(
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/runners/worker/bundle_processor.py", line 232, in process_encoded
    self.output(decoded_value)
  File "apache_beam/runners/worker/operations.py", line 526, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 528, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 237, in apache_beam.runners.worker.operations.SingletonElementConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 240, in apache_beam.runners.worker.operations.SingletonElementConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 907, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 908, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1420, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1492, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1418, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 624, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1582, in apache_beam.runners.common._OutputHandler.handle_process_outputs
  File "apache_beam/runners/common.py", line 1695, in apache_beam.runners.common._OutputHandler._write_value_to_tag
  File "apache_beam/runners/worker/operations.py", line 240, in apache_beam.runners.worker.operations.SingletonElementConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 907, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 908, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1420, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1492, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1418, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 624, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1582, in apache_beam.runners.common._OutputHandler.handle_process_outputs
  File "apache_beam/runners/common.py", line 1695, in apache_beam.runners.common._OutputHandler._write_value_to_tag
  File "apache_beam/runners/worker/operations.py", line 240, in apache_beam.runners.worker.operations.SingletonElementConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 907, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 908, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1420, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1508, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1418, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 624, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1572, in apache_beam.runners.common._OutputHandler.handle_process_outputs
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/runners/worker/bundle_processor.py", line 1515, in process
    for part, size in self.restriction_provider.split_and_size(
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/transforms/core.py", line 335, in split_and_size
    for part in self.split(element, restriction):
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/io/iobase.py", line 1641, in split
    estimated_size = restriction.source().estimate_size()
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/io/gcp/bigquery.py", line 721, in estimate_size
    job = bq._start_query_job(
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/utils/retry.py", line 275, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/io/gcp/bigquery_tools.py", line 615, in _start_query_job
    return self._start_job(request)
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/io/gcp/bigquery_tools.py", line 561, in _start_job
    response = self.client.jobs.Insert(request, upload=upload)
  File "/usr/local/lib/python3.9/dist-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py", line 343, in Insert
    return self._RunMethod(
  File "/usr/local/lib/python3.9/dist-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.9/dist-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.9/dist-packages/apitools/base/py/base_api.py", line 603, in __ProcessHttpResponse
    raise exceptions.HttpError.FromResponse(
RuntimeError: apitools.base.py.exceptions.HttpForbiddenError: HttpError accessing <https://bigquery.googleapis.com/bigquery/v2/projects/qwiklabs-gcp-00-929110efe78e/jobs?alt=json>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Wed, 24 Jul 2024 11:18:53 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '403', 'content-length': '502', '-content-encoding': 'gzip'}>, content <{ "error": { "code": 403, "message": "Access Denied: Project qwiklabs-gcp-00-929110efe78e: User does not have bigquery.jobs.create permission in project qwiklabs-gcp-00-929110efe78e.", "errors": [ { "message": "Access Denied: Project qwiklabs-gcp-00-929110efe78e: User does not have bigquery.jobs.create permission in project qwiklabs-gcp-00-929110efe78e.", "domain": "global", "reason": "accessDenied" } ], "status": "PERMISSION_DENIED" } } > [while running 'ReadFromBQ/ReadFromBigQuery/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction']
'''
Gajendra B. · Reviewed more than a year ago
Isaac G. · Reviewed more than a year ago
Great Lab
Mohamed R. · Reviewed more than a year ago
Julien B. · Reviewed more than a year ago
We cannot guarantee that the published reviews come from consumers who have purchased or used the products. The reviews are not verified by Google.