.TH "GCLOUD_DATAPROC_WORKFLOW\-TEMPLATES_ADD\-JOB" 1



.SH "NAME"
.HP
gcloud dataproc workflow\-templates add\-job \- add Dataproc jobs to workflow template



.SH "SYNOPSIS"
.HP
\f5gcloud dataproc workflow\-templates add\-job\fR \fICOMMAND\fR [\fIGCLOUD_WIDE_FLAG\ ...\fR]



.SH "EXAMPLES"

To add a Hadoop MapReduce job, run:

.RS 2m
$ gcloud dataproc workflow\-templates add\-job hadoop \e
    \-\-workflow\-template my_template \-\-jar my_jar.jar \-\- arg1 arg2
.RE

To add a Spark Scala or Java job, run:

.RS 2m
$ gcloud dataproc workflow\-templates add\-job spark \e
    \-\-workflow\-template my_template \-\-jar my_jar.jar \-\- arg1 arg2
.RE

To add a PySpark job, run:

.RS 2m
$ gcloud dataproc workflow\-templates add\-job pyspark \e
    \-\-workflow\-template my_template my_script.py \-\- arg1 arg2
.RE

To add a Spark SQL job, run:

.RS 2m
$ gcloud dataproc workflow\-templates add\-job spark\-sql \e
    \-\-workflow\-template my_template \-\-file my_queries.q
.RE

To add a Pig job, run:

.RS 2m
$ gcloud dataproc workflow\-templates add\-job pig \e
    \-\-workflow\-template my_template \-\-file my_script.pig
.RE

To add a Hive job, run:

.RS 2m
$ gcloud dataproc workflow\-templates add\-job hive \e
    \-\-workflow\-template my_template \-\-file my_queries.q
.RE
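
The \f5spark\-r\fR and \f5trino\fR subcommands listed under COMMANDS below follow the same pattern; the invocations here are sketches that assume their flags mirror the PySpark and Spark SQL examples above (\f5my_script.r\fR and \f5my_queries.sql\fR are placeholder file names). To add a SparkR job, run:

.RS 2m
$ gcloud dataproc workflow\-templates add\-job spark\-r \e
    \-\-workflow\-template my_template my_script.r \-\- arg1 arg2
.RE

To add a Trino job, run:

.RS 2m
$ gcloud dataproc workflow\-templates add\-job trino \e
    \-\-workflow\-template my_template \-\-file my_queries.sql
.RE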



.SH "GCLOUD WIDE FLAGS"

These flags are available to all commands: \-\-help.

Run \fB$ gcloud help\fR for details.



.SH "COMMANDS"

\f5\fICOMMAND\fR\fR is one of the following:

.RS 2m
.TP 2m
\fBhadoop\fR

Add a Hadoop job to the workflow template.

.TP 2m
\fBhive\fR

Add a Hive job to the workflow template.

.TP 2m
\fBpig\fR

Add a Pig job to the workflow template.

.TP 2m
\fBpresto\fR

Add a Presto job to the workflow template.

.TP 2m
\fBpyspark\fR

Add a PySpark job to the workflow template.

.TP 2m
\fBspark\fR

Add a Spark job to the workflow template.

.TP 2m
\fBspark\-r\fR

Add a SparkR job to the workflow template.

.TP 2m
\fBspark\-sql\fR

Add a Spark SQL job to the workflow template.

.TP 2m
\fBtrino\fR

Add a Trino job to the workflow template.


.RE
.sp

.SH "NOTES"

These variants are also available:

.RS 2m
$ gcloud alpha dataproc workflow\-templates add\-job
.RE

.RS 2m
$ gcloud beta dataproc workflow\-templates add\-job
.RE