.TH "GCLOUD_BETA_DATAPROC_JOBS_SUBMIT" 1



.SH "NAME"
.HP
gcloud beta dataproc jobs submit \- submit Dataproc jobs to execute on a cluster



.SH "SYNOPSIS"
.HP
\f5gcloud beta dataproc jobs submit\fR \fICOMMAND\fR [\fB\-\-async\fR] [\fB\-\-bucket\fR=\fIBUCKET\fR] [\fB\-\-region\fR=\fIREGION\fR] [\fIGCLOUD_WIDE_FLAG\ ...\fR]



.SH "DESCRIPTION"

\fB(BETA)\fR Submit Dataproc jobs to execute on a cluster.



.SH "EXAMPLES"

To submit a Hadoop MapReduce job, run:

.RS 2m
$ gcloud beta dataproc jobs submit hadoop \-\-cluster my\-cluster \e
    \-\-jar my_jar.jar \-\- arg1 arg2
.RE

To submit a Spark Scala or Java job, run:

.RS 2m
$ gcloud beta dataproc jobs submit spark \-\-cluster my\-cluster \e
    \-\-jar my_jar.jar \-\- arg1 arg2
.RE

To submit a PySpark job, run:

.RS 2m
$ gcloud beta dataproc jobs submit pyspark \-\-cluster my\-cluster \e
    my_script.py \-\- arg1 arg2
.RE

To submit a Spark SQL job, run:

.RS 2m
$ gcloud beta dataproc jobs submit spark\-sql \-\-cluster my\-cluster \e
    \-\-file my_queries.q
.RE

To submit a Pig job, run:

.RS 2m
$ gcloud beta dataproc jobs submit pig \-\-cluster my\-cluster \e
    \-\-file my_script.pig
.RE

To submit a Hive job, run:

.RS 2m
$ gcloud beta dataproc jobs submit hive \-\-cluster my\-cluster \e
    \-\-file my_queries.q
.RE
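
The remaining job types listed under COMMANDS follow the same pattern. For
instance, a SparkR submission analogous to the PySpark example above (the
script name is illustrative) would be:

.RS 2m
$ gcloud beta dataproc jobs submit spark\-r \-\-cluster my\-cluster \e
    my_script.r \-\- arg1 arg2
.RE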



.SH "FLAGS"

.RS 2m
.TP 2m
\fB\-\-async\fR

Return immediately, without waiting for the operation in progress to complete.

.TP 2m
\fB\-\-bucket\fR=\fIBUCKET\fR

The Cloud Storage bucket to stage files in. Defaults to the cluster's configured
bucket.

.TP 2m
\fB\-\-region\fR=\fIREGION\fR

Dataproc region to use. Each Dataproc region constitutes an independent resource
namespace constrained to deploying instances into Compute Engine zones inside
the region. Overrides the default \fBdataproc/region\fR property value for this
command invocation.


.RE
.sp
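
The flags above may be combined with any job type. For illustration (the
cluster, region, and bucket names are placeholders), a submission that stages
files in an explicit bucket and returns without waiting for completion might
look like:

.RS 2m
$ gcloud beta dataproc jobs submit spark \-\-cluster my\-cluster \e
    \-\-region us\-central1 \-\-bucket my\-staging\-bucket \-\-async \e
    \-\-jar my_jar.jar
.RE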

.SH "GCLOUD WIDE FLAGS"

These flags are available to all commands: \-\-help.

Run \fB$ gcloud help\fR for details.



.SH "COMMANDS"

\f5\fICOMMAND\fR\fR is one of the following:

.RS 2m
.TP 2m
\fBflink\fR

\fB(BETA)\fR Submit a Flink job to a cluster.

.TP 2m
\fBhadoop\fR

\fB(BETA)\fR Submit a Hadoop job to a cluster.

.TP 2m
\fBhive\fR

\fB(BETA)\fR Submit a Hive job to a cluster.

.TP 2m
\fBpig\fR

\fB(BETA)\fR Submit a Pig job to a cluster.

.TP 2m
\fBpresto\fR

\fB(BETA)\fR Submit a Presto job to a cluster.

.TP 2m
\fBpyspark\fR

\fB(BETA)\fR Submit a PySpark job to a cluster.

.TP 2m
\fBspark\fR

\fB(BETA)\fR Submit a Spark job to a cluster.

.TP 2m
\fBspark\-r\fR

\fB(BETA)\fR Submit a SparkR job to a cluster.

.TP 2m
\fBspark\-sql\fR

\fB(BETA)\fR Submit a Spark SQL job to a cluster.

.TP 2m
\fBtrino\fR

\fB(BETA)\fR Submit a Trino job to a cluster.


.RE
.sp

.SH "NOTES"

This command is currently in beta and might change without notice. These
variants are also available:

.RS 2m
$ gcloud dataproc jobs submit
.RE

.RS 2m
$ gcloud alpha dataproc jobs submit
.RE