oryondark/ServerlessMLInferenceLIB


Optimization of Serverless Platform with Theano

Current Version

  1. Torch to Theano converter
  2. No implementation yet for TensorFlow or other frameworks

Converter Flow Architecture

(Figure: ConverterFlow diagram)

Example

  1. Converter

from ml_inference.modeling import *
import numpy as np
import torch

# dnn_model is your trained PyTorch model (a torch.nn.Module).
# The dummy tensor is hooked through the model to capture layer shapes.
hooking_dummy = torch.Tensor(np.random.rand(3, 64, 64))
weight_parser(dnn_model, 'torch', hooking_dummy)

  2. Restore the model.

from ml_inference.modeling import *

# Rebuild the network from the weight file produced by weight_parser.
model = NeuralNet('weights.h5')
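Since ml_inference targets a Lambda bundle and may not be installed locally, the save/restore round trip above can be sketched conceptually with plain NumPy: extract each layer's weights into a flat dictionary, persist them to a file, and rebuild a forward pass from that file. The file name, layer keys, and `.npz` format below are illustrative stand-ins, not the library's actual HDF5 schema.

```python
import numpy as np

def save_weights(layers, path):
    """Persist a list of (W, b) layer weights to an .npz archive.
    Keys like 'W0'/'b0' are illustrative, not ml_inference's schema."""
    arrays = {}
    for i, (W, b) in enumerate(layers):
        arrays[f"W{i}"] = W
        arrays[f"b{i}"] = b
    np.savez(path, **arrays)

def load_weights(path):
    """Rebuild the (W, b) layer list from the archive."""
    data = np.load(path)
    n = len(data.files) // 2
    return [(data[f"W{i}"], data[f"b{i}"]) for i in range(n)]

def forward(layers, x):
    """Minimal dense forward pass with ReLU between hidden layers."""
    for i, (W, b) in enumerate(layers):
        x = x @ W + b
        if i < len(layers) - 1:
            x = np.maximum(x, 0.0)
    return x

# Two dense layers with random weights, saved and restored.
rng = np.random.default_rng(0)
layers = [(rng.standard_normal((4, 8)), rng.standard_normal(8)),
          (rng.standard_normal((8, 2)), rng.standard_normal(2))]
save_weights(layers, "weights_demo.npz")
restored = load_weights("weights_demo.npz")
```

The restored weights reproduce the original model's outputs exactly, which is the property the real converter relies on when it rebuilds the network inside Lambda.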

Summary

  1. This lightweight package, using Theano with scikit-learn, can be uploaded to AWS Lambda.
  2. It is slower than PyTorch, because a g++ environment cannot be set up for Theano on Lambda, so Theano falls back to its slower Python implementation.
  3. The library does not yet support TensorFlow or MXNet.
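To actually serve the restored model, it is typically wrapped in a Lambda handler bundled with the weight file. A minimal sketch follows; the `NeuralNet` class here is a stand-in stub for `ml_inference.modeling.NeuralNet` (its `predict` method and the event shape are assumptions, not the library's confirmed API), while `handler(event, context)` is the standard AWS Lambda Python entry-point signature.

```python
import json
import numpy as np

# Stand-in stub for ml_inference.modeling.NeuralNet; the real class
# would parse the HDF5 weight file. Here we fake a single identity layer.
class NeuralNet:
    def __init__(self, weight_path):
        self.W = np.eye(3)

    def predict(self, x):
        return x @ self.W

# Loaded once per Lambda container, so warm invocations skip the restore cost.
MODEL = NeuralNet("weights.h5")

def handler(event, context):
    """AWS Lambda entry point: event carries the input tensor as a JSON list."""
    x = np.asarray(event["input"], dtype=np.float64)
    y = MODEL.predict(x)
    return {"statusCode": 200, "body": json.dumps(y.tolist())}

# Local invocation with a dummy event (the context argument is unused here).
resp = handler({"input": [[1.0, 2.0, 3.0]]}, None)
```

Loading the model at module scope, outside the handler, is the usual way to amortize the weight-restore step across warm invocations.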

Contributors

Hyunjune Kim - email: 4u_olion@naver.com. You can call me Jey!
Kyungyong Lee - my advisor, an assistant professor at Kookmin University.

Bigdata Lab at Kookmin University

About

An integrated machine learning inference library for AWS Lambda.
