File: //snap/google-cloud-cli/396/help/man/man1/gcloud_beta_dataproc_sessions_create.1
.TH "GCLOUD_BETA_DATAPROC_SESSIONS_CREATE" 1



.SH "NAME"
.HP
gcloud beta dataproc sessions create \- create a Dataproc session



.SH "SYNOPSIS"
.HP
\f5gcloud beta dataproc sessions create\fR \fICOMMAND\fR [\fB\-\-async\fR] [\fB\-\-container\-image\fR=\fICONTAINER_IMAGE\fR] [\fB\-\-history\-server\-cluster\fR=\fIHISTORY_SERVER_CLUSTER\fR] [\fB\-\-kernel\fR=\fIKERNEL\fR] [\fB\-\-kms\-key\fR=\fIKMS_KEY\fR] [\fB\-\-labels\fR=[\fIKEY\fR=\fIVALUE\fR,...]] [\fB\-\-max\-idle\fR=\fIMAX_IDLE\fR] [\fB\-\-metastore\-service\fR=\fIMETASTORE_SERVICE\fR] [\fB\-\-property\fR=[\fIPROPERTY\fR=\fIVALUE\fR,...]] [\fB\-\-request\-id\fR=\fIREQUEST_ID\fR] [\fB\-\-service\-account\fR=\fISERVICE_ACCOUNT\fR] [\fB\-\-session_template\fR=\fISESSION_TEMPLATE\fR] [\fB\-\-staging\-bucket\fR=\fISTAGING_BUCKET\fR] [\fB\-\-tags\fR=[\fITAGS\fR,...]] [\fB\-\-ttl\fR=\fITTL\fR] [\fB\-\-user\-workload\-authentication\-type\fR=\fIUSER_WORKLOAD_AUTHENTICATION_TYPE\fR] [\fB\-\-version\fR=\fIVERSION\fR] [\fB\-\-network\fR=\fINETWORK\fR\ |\ \fB\-\-subnet\fR=\fISUBNET\fR] [\fIGCLOUD_WIDE_FLAG\ ...\fR]



.SH "DESCRIPTION"

\fB(BETA)\fR Create various sessions, such as Spark.



.SH "EXAMPLES"

To create a Spark session, run:

.RS 2m
$ gcloud beta dataproc sessions create spark my\-session \e
    \-\-location='us\-central1'
.RE



.SH "FLAGS"

.RS 2m
.TP 2m
\fB\-\-async\fR

Return immediately without waiting for the operation in progress to complete.

.TP 2m
\fB\-\-container\-image\fR=\fICONTAINER_IMAGE\fR

Optional custom container image to use for the batch/session runtime
environment. If not specified, a default container image will be used. The value
should follow the container image naming format:
{registry}/{repository}/{name}:{tag}, for example,
gcr.io/my\-project/my\-image:1.2.3
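
For example, a session created with a custom image (the image path below is
illustrative):

.RS 2m
$ gcloud beta dataproc sessions create spark my\-session \e
    \-\-location='us\-central1' \e
    \-\-container\-image='gcr.io/my\-project/my\-image:1.2.3'
.RE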

.TP 2m
\fB\-\-history\-server\-cluster\fR=\fIHISTORY_SERVER_CLUSTER\fR

Spark History Server configuration for the batch/session job. Resource name of
an existing Dataproc cluster to act as a Spark History Server for the workload
in the format: "projects/{project_id}/regions/{region}/clusters/{cluster_name}".

.TP 2m
\fB\-\-kernel\fR=\fIKERNEL\fR

Jupyter kernel type. The value could be "python" or "scala". \fIKERNEL\fR must
be one of: \fBpython\fR, \fBscala\fR.

.TP 2m
\fB\-\-kms\-key\fR=\fIKMS_KEY\fR

Cloud KMS key to use for encryption.

.TP 2m
\fB\-\-labels\fR=[\fIKEY\fR=\fIVALUE\fR,...]

List of label KEY=VALUE pairs to add.

Keys must start with a lowercase character and contain only hyphens (\f5\-\fR),
underscores (\f5_\fR), lowercase characters, and numbers. Values must contain
only hyphens (\f5\-\fR), underscores (\f5_\fR), lowercase characters, and
numbers.
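
For example, a sketch of labeling a session (the label keys and values are
illustrative):

.RS 2m
$ gcloud beta dataproc sessions create spark my\-session \e
    \-\-location='us\-central1' \e
    \-\-labels='env=dev,team=data\-platform'
.RE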

.TP 2m
\fB\-\-max\-idle\fR=\fIMAX_IDLE\fR

The duration after which an idle session will be automatically terminated, for
example, "20m" or "2h". A session is considered idle if it has no active Spark
applications and no active Jupyter kernels. Run gcloud topic datetimes
(https://cloud.google.com/sdk/gcloud/reference/topic/datetimes) for information
on duration formats.
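
For example, to terminate a session after it has been idle for 30 minutes:

.RS 2m
$ gcloud beta dataproc sessions create spark my\-session \e
    \-\-location='us\-central1' \e
    \-\-max\-idle=30m
.RE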

.TP 2m
\fB\-\-metastore\-service\fR=\fIMETASTORE_SERVICE\fR

Name of a Dataproc Metastore service to be used as an external metastore in the
format: "projects/{project\-id}/locations/{region}/services/{service\-name}".

.TP 2m
\fB\-\-property\fR=[\fIPROPERTY\fR=\fIVALUE\fR,...]

Specifies configuration properties for the session as PROPERTY=VALUE pairs.
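
For example, a sketch setting a Spark runtime property (the property shown is
illustrative):

.RS 2m
$ gcloud beta dataproc sessions create spark my\-session \e
    \-\-location='us\-central1' \e
    \-\-property='spark.executor.cores=2'
.RE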

.TP 2m
\fB\-\-request\-id\fR=\fIREQUEST_ID\fR

A unique ID that identifies the request. If the service receives two session
create requests with the same request_id, the second request is ignored and the
operation that corresponds to the first session created and stored in the
backend is returned. Recommendation: Always set this value to a UUID. The value
must contain only letters (a\-z, A\-Z), numbers (0\-9), underscores
(\f5_\fR), and hyphens (\f5\-\fR). The maximum length is 40 characters.
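
One way to supply a UUID is the common \f5uuidgen\fR utility (availability
varies by system):

.RS 2m
$ gcloud beta dataproc sessions create spark my\-session \e
    \-\-location='us\-central1' \e
    \-\-request\-id="$(uuidgen)"
.RE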

.TP 2m
\fB\-\-service\-account\fR=\fISERVICE_ACCOUNT\fR

The IAM service account to be used for a batch/session job.

.TP 2m
\fB\-\-session_template\fR=\fISESSION_TEMPLATE\fR

The session template to use for creating the session.

.TP 2m
\fB\-\-staging\-bucket\fR=\fISTAGING_BUCKET\fR

The Cloud Storage bucket to use to store job dependencies, config files, and job
driver console output. If not specified, the default [staging bucket]
(https://cloud.google.com/dataproc\-serverless/docs/concepts/buckets) is used.

.TP 2m
\fB\-\-tags\fR=[\fITAGS\fR,...]

Network tags for traffic control.

.TP 2m
\fB\-\-ttl\fR=\fITTL\fR

The duration after which the workload will be unconditionally terminated, for
example, '20m' or '1h'. Run gcloud topic datetimes
(https://cloud.google.com/sdk/gcloud/reference/topic/datetimes) for information
on duration formats.

.TP 2m
\fB\-\-user\-workload\-authentication\-type\fR=\fIUSER_WORKLOAD_AUTHENTICATION_TYPE\fR

Whether to use END_USER_CREDENTIALS or SERVICE_ACCOUNT to run the workload.

.TP 2m
\fB\-\-version\fR=\fIVERSION\fR

Optional runtime version. If not specified, a default version will be used.

.TP 2m

At most one of these can be specified:


.RS 2m
.TP 2m
\fB\-\-network\fR=\fINETWORK\fR

Network URI to connect the workload to.

.TP 2m
\fB\-\-subnet\fR=\fISUBNET\fR

Subnetwork URI to connect the workload to. The subnet must have Private Google
Access enabled.

.RE
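
For example, a sketch attaching the session to a specific subnetwork (the
resource names are illustrative):

.RS 2m
$ gcloud beta dataproc sessions create spark my\-session \e
    \-\-location='us\-central1' \e
    \-\-subnet='projects/my\-project/regions/us\-central1/subnetworks/my\-subnet'
.RE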
.RE
.sp

.SH "GCLOUD WIDE FLAGS"

These flags are available to all commands: \-\-help.

Run \fB$ gcloud help\fR for details.



.SH "COMMANDS"

\f5\fICOMMAND\fR\fR is one of the following:

.RS 2m
.TP 2m
\fBspark\fR

\fB(BETA)\fR Create a Spark session.


.RE
.sp

.SH "NOTES"

This command is currently in beta and might change without notice.