.TH "GCLOUD_DATAFLOW_JOBS_RUN" 1



.SH "NAME"
.HP
gcloud dataflow jobs run \- runs a job from the specified path



.SH "SYNOPSIS"
.HP
\f5gcloud dataflow jobs run\fR \fIJOB_NAME\fR \fB\-\-gcs\-location\fR=\fIGCS_LOCATION\fR [\fB\-\-additional\-experiments\fR=[\fIADDITIONAL_EXPERIMENTS\fR,...]] [\fB\-\-additional\-user\-labels\fR=[\fIADDITIONAL_USER_LABELS\fR,...]] [\fB\-\-dataflow\-kms\-key\fR=\fIDATAFLOW_KMS_KEY\fR] [\fB\-\-disable\-public\-ips\fR] [\fB\-\-enable\-streaming\-engine\fR] [\fB\-\-max\-workers\fR=\fIMAX_WORKERS\fR] [\fB\-\-network\fR=\fINETWORK\fR] [\fB\-\-num\-workers\fR=\fINUM_WORKERS\fR] [\fB\-\-parameters\fR=[\fIPARAMETERS\fR,...]] [\fB\-\-region\fR=\fIREGION_ID\fR] [\fB\-\-service\-account\-email\fR=\fISERVICE_ACCOUNT_EMAIL\fR] [\fB\-\-staging\-location\fR=\fISTAGING_LOCATION\fR] [\fB\-\-subnetwork\fR=\fISUBNETWORK\fR] [\fB\-\-worker\-machine\-type\fR=\fIWORKER_MACHINE_TYPE\fR] [[\fB\-\-[no\-]update\fR\ :\ \fB\-\-transform\-name\-mappings\fR=[\fITRANSFORM_NAME_MAPPINGS\fR,...]]] [\fB\-\-worker\-region\fR=\fIWORKER_REGION\fR\ |\ \fB\-\-worker\-zone\fR=\fIWORKER_ZONE\fR\ |\ \fB\-\-zone\fR=\fIZONE\fR] [\fIGCLOUD_WIDE_FLAG\ ...\fR]



.SH "DESCRIPTION"

Runs a job from the specified path.
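
For example, a minimal invocation might look like the following. The template
path and parameter names are illustrative (they assume the Google\-provided
Word_Count template is used); substitute the Cloud Storage location and
parameters of your own template:

.RS 2m
$ gcloud dataflow jobs run my\-wordcount\-job \-\-gcs\-location=gs://dataflow\-templates/latest/Word_Count \-\-region=us\-central1 \-\-parameters=inputFile=gs://my\-bucket/input.txt,output=gs://my\-bucket/counts
.RE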



.SH "POSITIONAL ARGUMENTS"

.RS 2m
.TP 2m
\fIJOB_NAME\fR

The unique name to assign to the job.


.RE
.sp

.SH "REQUIRED FLAGS"

.RS 2m
.TP 2m
\fB\-\-gcs\-location\fR=\fIGCS_LOCATION\fR

The Google Cloud Storage location of the job template to run. (Must be a URL
beginning with 'gs://'.)


.RE
.sp

.SH "OPTIONAL FLAGS"

.RS 2m
.TP 2m
\fB\-\-additional\-experiments\fR=[\fIADDITIONAL_EXPERIMENTS\fR,...]

Additional experiments to pass to the job. These experiments are appended to any
experiments already set by the template.

.TP 2m
\fB\-\-additional\-user\-labels\fR=[\fIADDITIONAL_USER_LABELS\fR,...]

Additional user labels to pass to the job. Example:
\-\-additional\-user\-labels='key1=value1,key2=value2'

.TP 2m
\fB\-\-dataflow\-kms\-key\fR=\fIDATAFLOW_KMS_KEY\fR

The Cloud KMS key to protect the job resources.

.TP 2m
\fB\-\-disable\-public\-ips\fR

The Cloud Dataflow workers must not use public IP addresses. Overrides the
default \fBdataflow/disable_public_ips\fR property value for this command
invocation.

.TP 2m
\fB\-\-enable\-streaming\-engine\fR

Enable Streaming Engine for the streaming job. Overrides the default
\fBdataflow/enable_streaming_engine\fR property value for this command
invocation.

.TP 2m
\fB\-\-max\-workers\fR=\fIMAX_WORKERS\fR

The maximum number of workers to run.

.TP 2m
\fB\-\-network\fR=\fINETWORK\fR

The Compute Engine network for launching instances to run your pipeline.

.TP 2m
\fB\-\-num\-workers\fR=\fINUM_WORKERS\fR

The initial number of workers to use.

.TP 2m
\fB\-\-parameters\fR=[\fIPARAMETERS\fR,...]

The parameters to pass to the job, as a comma\-separated list of 'key=value'
pairs expected by the template.

.TP 2m
\fB\-\-region\fR=\fIREGION_ID\fR

Region ID of the job's regional endpoint. Defaults to 'us\-central1'.

.TP 2m
\fB\-\-service\-account\-email\fR=\fISERVICE_ACCOUNT_EMAIL\fR

The service account to run the workers as.

.TP 2m
\fB\-\-staging\-location\fR=\fISTAGING_LOCATION\fR

The Google Cloud Storage location to stage temporary files. (Must be a URL
beginning with 'gs://'.)

.TP 2m
\fB\-\-subnetwork\fR=\fISUBNETWORK\fR

The Compute Engine subnetwork for launching instances to run your pipeline.

.TP 2m
\fB\-\-worker\-machine\-type\fR=\fIWORKER_MACHINE_TYPE\fR

The type of machine to use for workers. Defaults to server\-specified.

.TP 2m
\fB\-\-[no\-]update\fR

Set this to true for streaming update jobs; an example invocation follows this
flag list. Use \fB\-\-update\fR to enable and \fB\-\-no\-update\fR to disable.

.TP 2m
\fB\-\-transform\-name\-mappings\fR=[\fITRANSFORM_NAME_MAPPINGS\fR,...]

Transform name mappings for the streaming update job.

.TP 2m

Worker location options.

At most one of these can be specified:


.RS 2m
.TP 2m
\fB\-\-worker\-region\fR=\fIWORKER_REGION\fR

The region to run the workers in.

.TP 2m
\fB\-\-worker\-zone\fR=\fIWORKER_ZONE\fR

The zone to run the workers in.

.TP 2m
\fB\-\-zone\fR=\fIZONE\fR

(DEPRECATED) The zone to run the workers in.

The \-\-zone option is deprecated; use \-\-worker\-region or \-\-worker\-zone
instead.


.RE
.RE
.sp
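
For a streaming update of an existing job, \fB\-\-update\fR is typically
combined with \fB\-\-transform\-name\-mappings\fR. The job name, template
location, and mapping below are illustrative assumptions, not values defined
by this command:

.RS 2m
$ gcloud dataflow jobs run my\-streaming\-job \-\-gcs\-location=gs://my\-bucket/templates/my\-template \-\-region=us\-central1 \-\-update \-\-transform\-name\-mappings=oldTransformName=newTransformName
.RE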

.SH "GCLOUD WIDE FLAGS"

These flags are available to all commands: \-\-access\-token\-file, \-\-account,
\-\-billing\-project, \-\-configuration, \-\-flags\-file, \-\-flatten,
\-\-format, \-\-help, \-\-impersonate\-service\-account, \-\-log\-http,
\-\-project, \-\-quiet, \-\-trace\-token, \-\-user\-output\-enabled,
\-\-verbosity.

Run \fB$ gcloud help\fR for details.



.SH "NOTES"

This variant is also available:

.RS 2m
$ gcloud beta dataflow jobs run
.RE