BentoML: TypeError: 'TfKerasModelArtifact' object is not iterable

Hi there. Here’s my packaging code:

%%writefile text_classification_service.py
import pandas as pd
from tensorflow import keras
from sklearn.preprocessing import LabelEncoder
from sklearn.feature_extraction.text import CountVectorizer
from string import digits
from bentoml import api, env, BentoService, artifacts
from bentoml.artifact import TfKerasModelArtifact
from bentoml.handlers import JsonHandler

@artifacts([TfKerasModelArtifact('model')])
@env(conda_dependencies=['tensorflow', 'pandas', 'scikit-learn'])
class TextClassificationService(BentoService):

    def vectorizer(self):
        # Fit a binary CountVectorizer on the training corpus
        vectorizer = CountVectorizer(stop_words=None, lowercase=True,
                                     ngram_range=(1, 1), min_df=2, binary=True)
        train = pd.read_csv('https://raw.githubusercontent.com/Nilabhra/kolkata_nlp_workshop_2019/master/data/train.csv')
        vectorizer.fit_transform(train['text'])
        return vectorizer

    def remove_digits(self, s):
        # Strip all digit characters from the input string
        table = str.maketrans('', '', digits)
        return s.translate(table)

    @api(JsonHandler)
    def predict(self, parsed_json):
        text = parsed_json['text']
        text = self.remove_digits(text)
        vectorizer = self.vectorizer()
        # transform expects an iterable of documents, not a bare string
        features = vectorizer.transform([text])
        prediction = self.artifacts.model.predict_classes(features)
        return {'Sentiment': prediction}

And when I build the archive using:

from text_classification_service import TextClassificationService

svc = TextClassificationService.pack(model=model)
saved_path = svc.save('/tmp/bento')
print(saved_path)

It gives me the following error (with traceback):

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-28-39d1075ec116> in <module>
      1 from text_classification_service import TextClassificationService
      2 
----> 3 svc = TextClassificationService.pack(model=model)
      4 saved_path = svc.save('/tmp/bento')
      5 print(saved_path)

/miniconda3/lib/python3.7/site-packages/bentoml/service.py in pack(cls, *args, **kwargs)
    320         artifacts = ArtifactCollection()
    321 
--> 322         for artifact_spec in cls._artifacts_spec:
    323             if artifact_spec.name in kwargs:
    324                 artifact_instance = artifact_spec.pack(kwargs[artifact_spec.name])

TypeError: 'TfKerasModelArtifact' object is not iterable
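
For context, the `pack` loop in the traceback iterates over `cls._artifacts_spec`, so this error means that attribute ended up holding a single artifact object rather than a list of them. A minimal, illustrative sketch of the same failure mode (the class here is a hypothetical stand-in, not BentoML's actual code):

class FakeArtifact:
    """Stand-in for a single artifact object (illustrative only)."""
    pass

spec = FakeArtifact()  # a single object rather than a list

# Iterating over a plain object raises the same error seen above:
for artifact_spec in spec:  # TypeError: 'FakeArtifact' object is not iterable
    pass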

Here is the code for model building, compilation and fitting:

from tensorflow.keras.layers import Dense, Dropout

model = keras.Sequential()

# Dropout on the raw input features, then two dense blocks
model.add(Dropout(rate=0.2, input_shape=features.shape[1:]))
for _ in range(2):
    model.add(Dense(units=64, activation='relu'))
    model.add(Dropout(rate=0.2))
model.add(Dense(units=1, activation='sigmoid'))

model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['acc'])

# Stop training early once validation loss stops improving
es_cb = keras.callbacks.EarlyStopping(monitor='val_loss', patience=5)

model.fit(features,
          labels,
          epochs=15,
          batch_size=512,
          validation_data=(test_features, test_labels),
          callbacks=[es_cb],
          verbose=1)

Is there any artifact I am missing here?

About this issue

  • State: closed
  • Created 5 years ago
  • Comments: 15 (9 by maintainers)

Most upvoted comments

@yubozhao @parano it works as expected now. Here is the updated repository. Thank you very much to both of you for your sincere and generous help. Really appreciated.

@sayakpaul yes, and you can build BentoML locally like this:

git clone https://github.com/bentoml/BentoML.git
cd BentoML

# fetch the remote branch for pull request #97
git fetch origin pull/97/head:pr-97

# switch to that branch
git checkout pr-97

# install BentoML with the local changes
pip install .
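
After installing, a quick sanity check (my suggestion, not from the thread) is to confirm that Python is importing the local build rather than a previously released package:

import bentoml

# __file__ should point into your local clone of the repository
print(bentoml.__version__)
print(bentoml.__file__)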

I will also add a local development document next week to make it easier for people who want to contribute to BentoML.

@sayakpaul Hey, I created a quick fix that should solve this: https://github.com/bentoml/BentoML/pull/97

What I did is create a TF model wrapper that loads and predicts with the same session and graph. I am not sure this is a good long-term solution for this problem; a better long-term solution might be to separate the request-handling layer (Flask/other front end) from the inference server (TensorFlow Serving/etc.).
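
The idea looks roughly like the following minimal sketch (my illustration of the pattern, not the actual PR code; the class name is hypothetical and this assumes TF 1.x-style sessions):

import tensorflow as tf
from tensorflow import keras

class KerasModelWrapper:
    """Pin a Keras model to a dedicated graph and session so every
    prediction runs in the same context the model was loaded with."""

    def __init__(self, model_path):
        self.graph = tf.Graph()
        with self.graph.as_default():
            self.session = tf.Session(graph=self.graph)
            with self.session.as_default():
                self.model = keras.models.load_model(model_path)

    def predict(self, inputs):
        # Re-enter the stored graph and session on each call
        with self.graph.as_default():
            with self.session.as_default():
                return self.model.predict(inputs)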

I tested the patch locally with your notebook. I will do a few more tests and then merge this PR. If you test your notebook with this branch and find any issues, please ping me.

Thank you for reporting the bug. Have a good weekend!

Yes. I think I am going to try to do that and see how it goes. Will keep you updated.

Thank you for finding this issue in BentoML for us, @sayakpaul.

@sayakpaul Hi, I will check out your notebook and see what I can find out. I will ping you once I have more info.