
Questions tagged [tensorflow-serving]

The tag has no usage guidance, but it has a tag wiki.

0 votes
0 answers
11 views

Compute specs required to build the tensorflow serving docker image

I'm trying to build the TensorFlow Serving docker image from scratch. Currently, the build maxes out every CPU on any machine I attempt to spin up. There is no official documentation regarding this. Does anyone have any ...
Bilaal Rashid
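The serving repository ships a development Dockerfile that the official image is built from; a minimal sketch of the documented build, assuming the current repository layout:

    # clone the sources and build the CPU devel image from the shipped Dockerfile
    git clone https://github.com/tensorflow/serving
    cd serving
    docker build --pull -t tensorflow-serving-devel \
        -f tensorflow_serving/tools/docker/Dockerfile.devel .

Bazel's resource usage during this build can be capped with its --local_cpu_resources / --local_ram_resources options; whether they can be passed through a build argument such as TF_SERVING_BUILD_OPTIONS depends on the Dockerfile version, so check the Dockerfile before relying on that.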
0 votes
0 answers
22 views

Issue: StatusCode.FAILED_PRECONDITION

I'm trying to serve a model with TF Serving through the docker image tensorflow/serving:2.16.1 and got this issue: `<_InactiveRpcError of RPC that terminated with: status = StatusCode.FAILED_PRECONDITION details = &...
Nhựt Tiến
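FAILED_PRECONDITION from the prediction stub usually means the server is reachable but the requested model or version is not in a servable state. A minimal sketch of serving a SavedModel with that image, with the model name and host path as assumptions:

    docker run -p 8500:8500 -p 8501:8501 \
        -v /path/to/my_model:/models/my_model \
        -e MODEL_NAME=my_model \
        tensorflow/serving:2.16.1

The gRPC request's model_spec.name must match MODEL_NAME, and the mounted directory must contain a numeric version subdirectory (e.g. /models/my_model/1/saved_model.pb).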
0 votes
0 answers
8 views

Tensorflow Serving: Adding warm start data at runtime

I'm trying to dynamically add warm-start data for our models via the SavedModel Warmup method (https://www.tensorflow.org/tfx/serving/saved_model_warmup). In our case, we need to actually have the ...
trevoryao
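For context, the linked guide's standard (offline) mechanism is a TFRecord file of PredictionLog protos stored at assets.extra/tf_serving_warmup_requests inside the SavedModel version directory; doing this dynamically at runtime is the part the question is about. A minimal sketch of writing such a file, with the model name, input key, and dummy tensor as assumptions:

    import os
    import tensorflow as tf
    from tensorflow_serving.apis import predict_pb2, prediction_log_pb2

    request = predict_pb2.PredictRequest()
    request.model_spec.name = "my_model"                  # assumed model name
    request.model_spec.signature_name = "serving_default"
    request.inputs["input_1"].CopyFrom(                   # assumed input key
        tf.make_tensor_proto([[0.0] * 128], dtype=tf.float32))

    log = prediction_log_pb2.PredictionLog(
        predict_log=prediction_log_pb2.PredictLog(request=request))

    warmup_dir = "/models/my_model/1/assets.extra"        # assumed version directory
    os.makedirs(warmup_dir, exist_ok=True)
    with tf.io.TFRecordWriter(
            os.path.join(warmup_dir, "tf_serving_warmup_requests")) as writer:
        writer.write(log.SerializeToString())

The file is read when the model version is loaded, so regenerating it only takes effect on the next (re)load of that version.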
0 votes
0 answers
11 views

Multiple Model Configuration in tensorflow serving

I have created a file model.config for the configuration details. It is inside model_config, and model_config is inside the untitlled folder from where I am executing my script, and I am getting the path not ...
Sawan Rawat
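For reference, the multi-model configuration is a ModelServerConfig text proto passed with --model_config_file; path-not-found errors are commonly caused by passing a path relative to where the script runs rather than one the server process can resolve, so absolute paths are safer. A minimal sketch with names and paths assumed:

    # model.config
    model_config_list {
      config {
        name: "model_a"
        base_path: "/models/model_a"
        model_platform: "tensorflow"
      }
      config {
        name: "model_b"
        base_path: "/models/model_b"
        model_platform: "tensorflow"
      }
    }

    # start the server pointing at it (absolute path)
    tensorflow_model_server --rest_api_port=8501 \
        --model_config_file=/models/model.config

When running the official docker image, both the config file and the base_path directories must be mounted into the container.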
0 votes
0 answers
12 views

Handle label feature in TFX in different environments

I'm new to MLOps and trying to figure out how to work with the label feature in the data. I read that, for uniformity of the data, it is necessary to use the same schema for both the training and validation ...
AnnacKK
0 votes
0 answers
32 views

Ragged Tensor as an output from Tensorflow serving

We use TensorFlow Serving to serve models in production. We have a use case where the output of the model is a ragged tensor. To see whether TensorFlow Serving supports a ragged tensor as output, we ...
Ritesh (497)
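The Predict API exchanges dense TensorProtos, so one common workaround (a sketch, not necessarily what the asker settled on) is to export a signature that returns the ragged tensor's flat components and reassemble the RaggedTensor on the client:

    import tensorflow as tf

    class RaggedExporter(tf.Module):
        # toy stand-in: real code would call the actual model, which is
        # assumed to produce a tf.RaggedTensor
        @tf.function(input_signature=[tf.TensorSpec([None, 16], tf.float32)])
        def serve(self, x):
            rt = tf.RaggedTensor.from_tensor(x, padding=0.0)
            # return dense component tensors instead of the ragged tensor itself
            return {"values": rt.values, "row_splits": rt.row_splits}

    exporter = RaggedExporter()
    tf.saved_model.save(exporter, "/tmp/ragged_model/1",
                        signatures={"serving_default": exporter.serve})
    # client side: tf.RaggedTensor.from_row_splits(values, row_splits)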
0 votes
1 answer
15 views

Is there another loss that can replace seq2seq.sequence_loss in tensorflow

I am running a CVAE for text generation. I am using tensorflow > 2.0. The problem is that for my loss I use seq2seq.sequence_loss. I tried to update the code from tensorflow v1 to v2 since the code was ...
svmmy_776
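tfa.seq2seq.sequence_loss (TensorFlow Addons) is essentially per-timestep sparse categorical cross-entropy with masking, so it can be reproduced with core ops; a minimal sketch, assuming [batch, time] integer targets and [batch, time, vocab] logits:

    import tensorflow as tf

    def sequence_loss(logits, targets, mask):
        # logits:  [batch, time, vocab] unnormalized scores
        # targets: [batch, time] integer token ids
        # mask:    [batch, time] 1.0 for real tokens, 0.0 for padding
        ce = tf.keras.losses.sparse_categorical_crossentropy(
            targets, logits, from_logits=True)          # [batch, time]
        ce = ce * mask
        # mean over non-padded timesteps only
        return tf.reduce_sum(ce) / tf.maximum(tf.reduce_sum(mask), 1.0)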
0 votes
0 answers
14 views

Why Tensorflow SavedModel is a directory

The people who designed the SavedModelBundle definitely had a very good reason to make it export and load models to a directory on disk. However, doesn't this defeat the very important purpose of ...
Arian Maghsoudnia
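For context, the format is a directory mainly because the serialized graph and the (potentially large, sharded) weights are kept as separate files; a typical layout looks like:

    saved_model.pb      # serialized program / MetaGraphDef and signatures
    variables/          # variables.index plus variables.data-00000-of-00001 (possibly sharded)
    assets/             # external files such as vocabularies
    assets.extra/       # optional extras, e.g. tf_serving_warmup_requests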
0 votes
0 answers
42 views

Error with loading a .pb model for prediction: Op type not registered 'DecodeProtoSparseV4'

By running this code to load a trained model (saved_model.pb) for prediction: model_directory = 'C:/Users/.../predict/001/' model = tf.saved_model.load(model_directory) I get the ...
user7 (1)
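"Op type not registered" means the SavedModel references a custom op whose kernel is not linked into the process doing the loading; DecodeProtoSparse* ops are not part of core TensorFlow (they ship with the struct2tensor extension used by some TFX pipelines, which is an assumption about where this particular graph came from). A hedged sketch of the usual fix, registering the op library before loading:

    import tensorflow as tf

    # Option 1 (hypothetical fix): import the Python package that ships the op,
    # which registers its kernels as a side effect.
    # import struct2tensor
    # Option 2: load the op's shared library directly if you have the .so file.
    # tf.load_op_library("/path/to/the_custom_op.so")   # placeholder path

    model_directory = "/path/to/predict/001/"            # placeholder for the real path
    model = tf.saved_model.load(model_directory)
    print(list(model.signatures))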
1 vote
0 answers
91 views

Serving keras model with tensorflow serving error

I have created a model using Keras which works locally. But after upgrading TensorFlow to 2.17.0 I started receiving a strange error on the TF Serving side. The model was serialized like so: import ...
Oleg (3,140)
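For reference, TF 2.16+/2.17 bundles Keras 3, where model.save(...) writes the new .keras archive rather than the SavedModel layout TF Serving loads; the serving-oriented path is model.export() (or tf.saved_model.save on a wrapper module). A minimal sketch with a toy model and an assumed export path:

    import keras

    # toy model standing in for the real one
    model = keras.Sequential([keras.Input(shape=(4,)), keras.layers.Dense(1)])

    # writes a TF SavedModel with a serving_default signature under the
    # numeric version directory layout tensorflow/serving expects
    model.export("/models/my_model/1")

Whether this matches the asker's actual failure depends on the error text that was cut off above.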
0 votes
1 answer
77 views

TFLite converter not replacing dummy_function with TFLite_Detection_PostProcess

I want to implement a tf.Module for decoding box predictions and applying NonMaxSuppression that is convertible to tflite. This implementation includes elements from here. It also follows this guide ...
Robert Sundermeyer
0 votes
0 answers
85 views

Tensorflow Serving prometheus metrics are unclear and latency is high

Background: We have deployed a service with a Tensorflow Serving (TFServing) container to Kubernetes with server-side batching enabled. When the service receives an inference request, it invokes TFServing ...
Jun.Z (1)
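For reference, TF Serving only exposes Prometheus metrics when a monitoring config is supplied, and the request latency histograms are reportedly recorded in microseconds, which is a frequent source of confusion when reading them. A minimal sketch, with paths assumed:

    # monitoring.config
    prometheus_config {
      enable: true
      path: "/monitoring/prometheus/metrics"
    }

    # pass it to the server alongside the model flags
    tensorflow_model_server --rest_api_port=8501 \
        --model_name=my_model --model_base_path=/models/my_model \
        --monitoring_config_file=/models/monitoring.config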
0 votes
0 answers
22 views

Key and output issues with Tensorflow serving

My TF Serving script is not predicting the correct result. I am sending a request through grpc; here is my full code: import grpc import numpy as np import tensorflow as tf from tensorflow_serving....
Timcho (3)
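For comparison, a minimal gRPC Predict client sketch against the default port 8500; the model name and input key below are assumptions, and mismatched signature_name, input keys, or preprocessing are the usual causes of "wrong result" symptoms:

    import grpc
    import numpy as np
    import tensorflow as tf
    from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

    channel = grpc.insecure_channel("localhost:8500")
    stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

    request = predict_pb2.PredictRequest()
    request.model_spec.name = "my_model"                 # assumed model name
    request.model_spec.signature_name = "serving_default"
    request.inputs["input_1"].CopyFrom(                  # assumed input key
        tf.make_tensor_proto(np.zeros((1, 28, 28, 1), dtype=np.float32)))

    response = stub.Predict(request, 10.0)               # 10 s deadline
    print(response.outputs)

The exact input/output keys can be checked with saved_model_cli show --dir <export_dir> --all.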
0 votes
0 answers
66 views

How can I display logs for models served by TensorFlow Serving using GRPC?

I am using TensorFlow Serving and the GRPC pattern for serving models. However, I have encountered problems in displaying logs for the served models. While I've successfully used the env var ...
Leonardo (178)
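Independent of whichever env var the truncated excerpt refers to, TensorFlow's C++ log verbosity inside the serving container can be raised with TF_CPP_MIN_LOG_LEVEL / TF_CPP_MIN_VLOG_LEVEL; a sketch of passing them to the official image (this is one knob, not necessarily the one the asker means):

    docker run -p 8500:8500 \
        -v /path/to/my_model:/models/my_model \
        -e MODEL_NAME=my_model \
        -e TF_CPP_MIN_LOG_LEVEL=0 \
        -e TF_CPP_MIN_VLOG_LEVEL=1 \
        tensorflow/serving

How much per-request gRPC detail this surfaces depends on the build; it mainly controls TensorFlow-level log verbosity.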
0 votes
1 answer
90 views

Error in HTTP POST request from Flask docker container to Tensorflow/Serving container

I am trying to deploy my micro projects using docker compose on AWS EC2. The major container images are Nginx, Flask, MySQL, and Tensorflow/Serving. I have successfully made a connection between ...
J. Song
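For reference, a minimal sketch of calling the TF Serving REST API from another docker-compose service; the service hostname and model name below are assumptions, and using the compose service name instead of localhost is the usual fix for container-to-container connection errors:

    import requests

    # assumed compose service name and model name
    url = "http://tensorflow-serving:8501/v1/models/my_model:predict"
    payload = {"instances": [[0.1, 0.2, 0.3, 0.4]]}   # must match the model's input shape

    resp = requests.post(url, json=payload, timeout=10)
    resp.raise_for_status()
    print(resp.json()["predictions"])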
