.TH "GCLOUD_AI\-PLATFORM_LOCAL_PREDICT" 1



.SH "NAME"
.HP
gcloud ai\-platform local predict \- run prediction locally



.SH "SYNOPSIS"
.HP
\f5gcloud ai\-platform local predict\fR \fB\-\-model\-dir\fR=\fIMODEL_DIR\fR (\fB\-\-json\-instances\fR=\fIJSON_INSTANCES\fR\ |\ \fB\-\-json\-request\fR=\fIJSON_REQUEST\fR\ |\ \fB\-\-text\-instances\fR=\fITEXT_INSTANCES\fR) [\fB\-\-framework\fR=\fIFRAMEWORK\fR] [\fB\-\-signature\-name\fR=\fISIGNATURE_NAME\fR] [\fIGCLOUD_WIDE_FLAG\ ...\fR]



.SH "DESCRIPTION"

\fBgcloud ai\-platform local predict\fR performs prediction locally with the
given instances. It requires the TensorFlow SDK
(https://www.tensorflow.org/install) be installed locally. The output format
mirrors \f5gcloud ai\-platform predict\fR (online prediction).

You cannot use this command with custom prediction routines.


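.SH "EXAMPLES"

The following invocations are illustrative sketches; the model directory and
instance file names are placeholders, not values taken from this manual.

Run local prediction on newline\-delimited JSON instances:

.RS 2m
$ gcloud ai\-platform local predict \-\-model\-dir=output/model_dir \-\-json\-instances=instances.json
.RE

Pipe instances from stdin by passing "\-" as the file argument:

.RS 2m
$ cat instances.json | gcloud ai\-platform local predict \-\-model\-dir=output/model_dir \-\-json\-instances=\-
.RE
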

.SH "REQUIRED FLAGS"

.RS 2m
.TP 2m
\fB\-\-model\-dir\fR=\fIMODEL_DIR\fR

Path to the local directory containing the model.

.TP 2m

Exactly one of these must be specified:


.RS 2m
.TP 2m
\fB\-\-json\-instances\fR=\fIJSON_INSTANCES\fR

Path to a local file from which instances are read. Instances are in JSON
format; newline delimited.

An example of the JSON instances file:

.RS 2m
{"images": [0.0, ..., 0.1], "key": 3}
{"images": [0.0, ..., 0.1], "key": 2}
...
.RE

This flag accepts "\-" for stdin.

.TP 2m
\fB\-\-json\-request\fR=\fIJSON_REQUEST\fR

Path to a local file containing the body of the JSON request.

An example of a JSON request:

.RS 2m
{
  "instances": [
    {"x": [1, 2], "y": [3, 4]},
    {"x": [\-1, \-2], "y": [\-3, \-4]}
  ]
}
.RE

This flag accepts "\-" for stdin.

.TP 2m
\fB\-\-text\-instances\fR=\fITEXT_INSTANCES\fR

Path to a local file from which instances are read. Instances are in UTF\-8
encoded text format; newline delimited.

An example of the text instances file:

.RS 2m
107,4.9,2.5,4.5,1.7
100,5.7,2.8,4.1,1.3
...
.RE

This flag accepts "\-" for stdin.


.RE
.RE
.sp

.SH "OPTIONAL FLAGS"

.RS 2m
.TP 2m
\fB\-\-framework\fR=\fIFRAMEWORK\fR

ML framework used to train this version of the model. If not specified, defaults
to 'tensorflow'. \fIFRAMEWORK\fR must be one of: \fBscikit\-learn\fR,
\fBtensorflow\fR, \fBxgboost\fR.

.TP 2m
\fB\-\-signature\-name\fR=\fISIGNATURE_NAME\fR

Name of the signature defined in the SavedModel to use for this job. Defaults to
DEFAULT_SERVING_SIGNATURE_DEF_KEY in
https://www.tensorflow.org/api_docs/python/tf/compat/v1/saved_model/signature_constants,
which is "serving_default". Only applies to TensorFlow models.


.RE
.sp

.SH "GCLOUD WIDE FLAGS"

These flags are available to all commands: \-\-access\-token\-file, \-\-account,
\-\-billing\-project, \-\-configuration, \-\-flags\-file, \-\-flatten,
\-\-format, \-\-help, \-\-impersonate\-service\-account, \-\-log\-http,
\-\-project, \-\-quiet, \-\-trace\-token, \-\-user\-output\-enabled,
\-\-verbosity.

Run \fB$ gcloud help\fR for details.



.SH "NOTES"

These variants are also available:

.RS 2m
$ gcloud alpha ai\-platform local predict
.RE

.RS 2m
$ gcloud beta ai\-platform local predict
.RE