Set Up IAM and Networking for Your Dataflow Jobs: Reviews

12,783 reviews

Felix W. · Reviewed almost 4 years ago

Error: the default Python version is 3.9.x, but apache-beam[gcp] only supports 3.8.x.

Robin J. · Reviewed almost 4 years ago
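
The mismatch described in this review can be confirmed from inside the lab's virtualenv before launching anything. A minimal diagnostic sketch, not part of the lab instructions, that prints the interpreter and Beam SDK versions in play:

    # Print the interpreter and Beam SDK versions so the 3.9-vs-3.8 mismatch
    # described in these reviews is visible before any job is submitted.
    import sys
    import apache_beam as beam

    print("Python interpreter:", ".".join(map(str, sys.version_info[:3])))
    print("apache-beam[gcp] SDK:", beam.__version__)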

The connection was not working for the jobs, and the jobs took a long time.

Pradeep U. · Reviewed almost 4 years ago

A few of the commands are outdated.

Mal W. · Reviewed almost 4 years ago

File "/usr/lib/python3.9/runpy.py", line 197, in _run_module_as_main return _run_code(code, main_globals, None, File "/usr/lib/python3.9/runpy.py", line 87, in _run_code exec(code, run_globals) File "/home/student_04_ceb375744f38/df-env/lib/python3.9/site-packages/apache_beam/examples/wordcount.py", line 103, in <module> run() File "/home/student_04_ceb375744f38/df-env/lib/python3.9/site-packages/apache_beam/examples/wordcount.py", line 98, in run output | 'Write' >> WriteToText(known_args.output) File "/home/student_04_ceb375744f38/df-env/lib/python3.9/site-packages/apache_beam/pipeline.py", line 596, in __exit__ self.result = self.run() File "/home/student_04_ceb375744f38/df-env/lib/python3.9/site-packages/apache_beam/pipeline.py", line 546, in run return Pipeline.from_runner_api( File "/home/student_04_ceb375744f38/df-env/lib/python3.9/site-packages/apache_beam/pipeline.py", line 573, in run return self.runner.run_pipeline(self, self._options) File "/home/student_04_ceb375744f38/df-env/lib/python3.9/site-packages/apache_beam/runners/dataflow/dataflow_runner.py", line 582, in run_pipeline self.dataflow_client.create_job(self.job), self) File "/home/student_04_ceb375744f38/df-env/lib/python3.9/site-packages/apache_beam/utils/retry.py", line 253, in wrapper return fun(*args, **kwargs) File "/home/student_04_ceb375744f38/df-env/lib/python3.9/site-packages/apache_beam/runners/dataflow/internal/apiclient.py", line 724, in create_job self.create_job_description(job) File "/home/student_04_ceb375744f38/df-env/lib/python3.9/site-packages/apache_beam/runners/dataflow/internal/apiclient.py", line 822, in create_job_description job.proto.environment = Environment( File "/home/student_04_ceb375744f38/df-env/lib/python3.9/site-packages/apache_beam/runners/dataflow/internal/apiclient.py", line 184, in __init__ _verify_interpreter_version_is_supported(options) File "/home/student_04_ceb375744f38/df-env/lib/python3.9/site-packages/apache_beam/runners/dataflow/internal/apiclient.py", line 1272, in _verify_interpreter_version_is_supported raise Exception(

Tristan T. · Reviewed almost 4 years ago
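
For context, the traceback above is raised client-side, inside the Dataflow runner's apiclient.py, before any job reaches the service: _verify_interpreter_version_is_supported rejects the Python 3.9 interpreter that Cloud Shell provides. Below is a rough sketch of the kind of wordcount-style launch that hits that check; the project, region, and bucket values are placeholders, not the lab's actual names:

    # Wordcount-style launch on the DataflowRunner, roughly equivalent to the
    # lab's run of apache_beam.examples.wordcount. The interpreter version
    # check fires while the job description is being built, before submission.
    # PROJECT, REGION and BUCKET below are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    PROJECT = "my-project-id"   # placeholder
    REGION = "us-central1"      # placeholder
    BUCKET = "gs://my-bucket"   # placeholder

    options = PipelineOptions([
        "--runner=DataflowRunner",
        f"--project={PROJECT}",
        f"--region={REGION}",
        f"--staging_location={BUCKET}/staging",
        f"--temp_location={BUCKET}/temp",
    ])

    with beam.Pipeline(options=options) as p:
        (p
         | "Read" >> beam.io.ReadFromText("gs://dataflow-samples/shakespeare/kinglear.txt")
         | "Split" >> beam.FlatMap(lambda line: line.split())
         | "Count" >> beam.combiners.Count.PerElement()
         | "Format" >> beam.MapTuple(lambda word, n: f"{word}: {n}")
         | "Write" >> beam.io.WriteToText(f"{BUCKET}/results/output"))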

ROHIT P. · Reviewed almost 4 years ago

Ashish S. · Reviewed almost 4 years ago

Had to go off the trail with --experiment use_unsupported_python_version. Can't complete this lab with the provided code; no time left to diagnose or correct packages. "Error syncing pod d3f2b51794d3fd272770f868250b1791 ("df-beamapp-student005a377216-02201449-5d9m-harness-5d9g_default(d3f2b51794d3fd272770f868250b1791)"), skipping: failed to "StartContainer" for "python" with CrashLoopBackOff: "back-off 10s restarting failed container=python pod=df-beamapp-student005a377216-02201449-5d9m-harness-5d9g_default(d3f2b51794d3fd272770f868250b1791)""

Jordon M. · Reviewed about 4 years ago

Catarina D. · Reviewed about 4 years ago

Balaram D. · Reviewed about 4 years ago

Sourav S. · Reviewed about 4 years ago

The lab instructions do not play well with Python 3.9, which is the default for the Cloud Shell at the moment. The Dataflow job failed twice and the lab time ran out on my third attempt.

Mohamed A. · Reviewed about 4 years ago

Please note the lab seems to be intended for Python 3.7, but most of the labs currently running use Python 3.9, which is not compatible and fails. So we need to pass --experiment use_unsupported_python_version to avoid failures. It would be good to have this updated in the labs... but it was a good learning experience that I wanted to share.

Krishna V. · Reviewed about 4 years ago
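
The workaround this review describes is just an extra pipeline option: --experiment use_unsupported_python_version on the command line. A sketch of the same thing in code, with placeholder project and bucket names; note that other reviews on this page report the worker containers can still crash with CrashLoopBackOff even once the client-side check is bypassed:

    # Sketch only: add the experiment the reviews mention to skip the
    # client-side interpreter version check at submit time. Per other reviews
    # on this page, the worker container may still crash (CrashLoopBackOff)
    # if the service-side SDK image does not support the 3.9 interpreter.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=DataflowRunner",
        "--project=my-project-id",                       # placeholder
        "--region=us-central1",                          # placeholder
        "--temp_location=gs://my-bucket/temp",           # placeholder
        "--experiments=use_unsupported_python_version",  # the workaround from the reviews
    ])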

The code provided is not working.

Souvik B. · Reviewed about 4 years ago

This lab uses Python 3.9, but Dataflow can only run 3.6 to 3.8, so I can't run this lab.

Chittanutchote W. · Reviewed about 4 years ago

H F. · Reviewed about 4 years ago

The Python version in the lab does not correspond to the one needed for the Dataflow SDK. Tried to launch the job using the command-line parameter --experiment use_unsupported_python_version, but the job later crashes repeatedly on a kubelet error: "Error syncing pod d3f2b51794d3fd272770f868250b1791 ("df-beamapp-student0059c260e1-02180722-gtoz-harness-hqlc_default(d3f2b51794d3fd272770f868250b1791)"), skipping: failed to "StartContainer" for "python" with CrashLoopBackOff: "back-off 10s restarting failed container=python pod=df-beamapp-student0059c260e1-02180722-gtoz-harness-hqlc_default(d3f2b51794d3fd272770f868250b1791)""

Christophe S. · Reviewed about 4 years ago

Javier M. · Reviewed about 4 years ago

Sachin K. · Reviewed about 4 years ago

Md S. · Reviewed about 4 years ago

The version of Python is 3.9, and I got errors saying I need 3.8; I could not install it, so I was not able to complete the lab. Please fix this.

Jean-Sébastien P. · Reviewed about 4 years ago

Alceu D. · Reviewed about 4 years ago

Jagdeep S. · Reviewed about 4 years ago

We do not guarantee that published reviews come from consumers who purchased or used the products. Reviews are not verified by Google.