def caffe2rknn(caffe_proto, caffe_weight, rknn_model):
    print("start export")
    rknn = RKNN(verbose=True)
    ret = rknn.load_caffe(model=caffe_proto, proto="caffe", blobs=caffe_weight)
    rknn.config(channel_mean_value='127.5 127.5 127.5 128.0',
                reorder_channel='2 1 0',
                # reorder_channel='0 1 2',
                # need_horizontal_merge=True
                )
    ret = rknn.build(do_quantization=False)
    # ret = rknn.build(do_quantization=True)
    ret = rknn.export_rknn(export_path=rknn_model)
    print("export finished")
exit(-1)

# Create RKNN object
rknn = RKNN(verbose=False)

# Set model config
print('--> Config model')
rknn.config(mean_values=[[103.94, 116.78, 123.68]],
            std_values=[[1, 1, 1]],
            reorder_channel='2 1 0')
print('done')

# Load caffe model
print('--> Loading model')
ret = rknn.load_caffe(
    model='./deploy_rm_detection_output.prototxt',
    proto='caffe',
    blobs='./VGG_VOC0712_SSD_300x300_iter_120000.caffemodel')
if ret != 0:
    print('Load model failed! Ret = {}'.format(ret))
    exit(ret)
print('done')

# Build model
print('--> Building model')
ret = rknn.build(do_quantization=True, dataset='./dataset.txt')
if ret != 0:
    print('Build model failed!')
    exit(ret)
print('done')

# Export RKNN model
if __name__ == '__main__':
    # Create RKNN object
    rknn = RKNN()

    # Pre-process config
    print('--> config model')
    rknn.config(mean_values=[[0, 0, 0]],
                std_values=[[1, 1, 1]],
                reorder_channel='2 1 0')
    print('done')

    # Load caffe model
    print('--> Loading model')
    ret = rknn.load_caffe(model='./deploy.prototxt',
                          proto='caffe',
                          blobs='./solver_iter_45.caffemodel')
    if ret != 0:
        print('Load interp_test failed! Ret = {}'.format(ret))
        exit(ret)
    print('done')

    # Build model
    print('--> Building model')
    ret = rknn.build(do_quantization=True, dataset='./dataset.txt')
    if ret != 0:
        print('Build interp_test failed!')
        exit(ret)
    print('done')

    # Export rknn model
if __name__ == '__main__':
    # Create RKNN object
    rknn = RKNN()

    # Pre-process config
    print('--> config model')
    rknn.config(channel_mean_value='127.5 127.5 127.5 128',
                reorder_channel='2 1 0',
                quantized_dtype='dynamic_fixed_point-8')
    print('done')

    # Load caffe model
    print('--> Loading model')
    ret = rknn.load_caffe(model='./ONet.prototxt',
                          proto='caffe',
                          blobs='./ONet.caffemodel')
    if ret != 0:
        print('Load model failed!')
        exit(ret)
    print('done')

    # Build model
    print('--> Building model')
    ret = rknn.build(do_quantization=True, dataset='./dataset_onet.txt')
    if ret != 0:
        print('Build model failed!')
        exit(ret)
    print('done')

    # Export rknn model
if __name__ == '__main__':
    # Create RKNN object
    rknn = RKNN()

    # Pre-process config
    print('--> config model')
    rknn.config(channel_mean_value='103.94 116.78 123.68 58.82',
                reorder_channel='2 1 0')
    print('done')

    # Load caffe model
    print('--> Loading model')
    ret = rknn.load_caffe(model='./mobilenet_v2.prototxt',
                          proto='caffe',
                          blobs='./mobilenet_v2.caffemodel')
    if ret != 0:
        print('Load mobilenet_v2 failed! Ret = {}'.format(ret))
        exit(ret)
    print('done')

    # Build model
    print('--> Building model')
    ret = rknn.build(do_quantization=True, dataset='./dataset.txt')
    if ret != 0:
        print('Build mobilenet_v2 failed!')
        exit(ret)
    print('done')

    # Export rknn model
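The snippets above mix two equivalent preprocessing styles: the older four-value `channel_mean_value='M0 M1 M2 S'` string (three per-channel means plus one shared scale, applied as `(pixel - mean) / scale`) and the newer `mean_values`/`std_values` lists. The helper below is a small illustration written for this note, not part of the RKNN API; it converts the string form into the list form so the two styles can be compared directly.

```python
def split_channel_mean_value(channel_mean_value):
    """Split an 'M0 M1 M2 S' preprocessing string into mean/std lists.

    The first three numbers are per-channel means; the last is a single
    scale shared by all channels. Illustration only -- not an RKNN API call.
    """
    values = [float(v) for v in channel_mean_value.split()]
    if len(values) != 4:
        raise ValueError('expected "M0 M1 M2 S", got %r' % channel_mean_value)
    means, scale = values[:3], values[3]
    # RKNN expects one list per input, hence the extra nesting.
    return [means], [[scale] * 3]

# The mobilenet_v2 string above becomes:
mean_values, std_values = split_channel_mean_value('103.94 116.78 123.68 58.82')
```

With this mapping, `rknn.config(channel_mean_value='103.94 116.78 123.68 58.82', ...)` and `rknn.config(mean_values=[[103.94, 116.78, 123.68]], std_values=[[58.82, 58.82, 58.82]], ...)` describe the same normalization.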
def convert_model(model_path, out_path, pre_compile):
    if os.path.isfile(model_path):
        yaml_config_file = model_path
        model_path = os.path.dirname(yaml_config_file)
    else:
        yaml_config_file = os.path.join(model_path, 'model_config.yml')
    if not os.path.exists(yaml_config_file):
        print('model config %s does not exist!' % yaml_config_file)
        exit(-1)

    model_configs = parse_model_config(yaml_config_file)

    exported_rknn_model_path_list = []
    for model_name in model_configs['models']:
        model = model_configs['models'][model_name]

        rknn = RKNN()
        rknn.config(**model['configs'])

        print('--> Loading model...')
        if model['platform'] == 'tensorflow':
            model_file_path = os.path.join(model_path, model['model_file_path'])
            input_size_list = []
            for input_size_str in model['subgraphs']['input-size-list']:
                input_size = list(map(int, input_size_str.split(',')))
                input_size_list.append(input_size)
            rknn.load_tensorflow(tf_pb=model_file_path,
                                 inputs=model['subgraphs']['inputs'],
                                 outputs=model['subgraphs']['outputs'],
                                 input_size_list=input_size_list)
        elif model['platform'] == 'tflite':
            model_file_path = os.path.join(model_path, model['model_file_path'])
            rknn.load_tflite(model=model_file_path)
        elif model['platform'] == 'caffe':
            prototxt_file_path = os.path.join(model_path, model['prototxt_file_path'])
            caffemodel_file_path = os.path.join(model_path, model['caffemodel_file_path'])
            rknn.load_caffe(model=prototxt_file_path,
                            proto='caffe',
                            blobs=caffemodel_file_path)
        elif model['platform'] == 'onnx':
            model_file_path = os.path.join(model_path, model['model_file_path'])
            rknn.load_onnx(model=model_file_path)
        else:
            print("platform %s not supported!" % (model['platform']))
        print('done')

        if model['quantize']:
            dataset_path = os.path.join(model_path, model['dataset'])
        else:
            dataset_path = './dataset'

        print('--> Build RKNN model...')
        rknn.build(do_quantization=model['quantize'],
                   dataset=dataset_path,
                   pre_compile=pre_compile)
        print('done')

        export_rknn_model_path = "%s.rknn" % (os.path.join(out_path, model_name))
        print('--> Export RKNN model to: {}'.format(export_rknn_model_path))
        rknn.export_rknn(export_path=export_rknn_model_path)
        exported_rknn_model_path_list.append(export_rknn_model_path)
        print('done')

    return exported_rknn_model_path_list
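`convert_model` reads a `model_config.yml` next to the model files. The exact schema is defined by `parse_model_config`, which is not shown here; the fragment below is a sketch inferred from the keys the function accesses (`platform`, `model_file_path`, `subgraphs`, `configs`, `quantize`, `dataset`), with all names and values hypothetical.

```yaml
# Hypothetical model_config.yml, inferred from the keys convert_model reads.
models:
  my_tf_model:                      # model_name; the output becomes my_tf_model.rknn
    platform: tensorflow            # one of: tensorflow, tflite, caffe, onnx
    model_file_path: ./frozen_graph.pb
    subgraphs:
      inputs: [input_tensor]
      outputs: [output_tensor]
      input-size-list:
        - 224,224,3                 # parsed with int(...) per comma-separated field
    configs:                        # passed through as rknn.config(**configs)
      mean_values: [[0, 0, 0]]
      std_values: [[255, 255, 255]]
      reorder_channel: 2 1 0
    quantize: true
    dataset: ./dataset.txt          # only used when quantize is true
```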
from rknn.api import RKNN

# Create RKNN object
rknn = RKNN(verbose=True)

# Load caffe model
print('--> Loading model')
ret = rknn.load_caffe(model='./model/new_model/openpose.prototxt',
                      proto='caffe',
                      blobs='./model/new_model/openpose.caffemodel')
# Print error message
if ret != 0:
    print('Load failed!')
    exit(ret)
print('done')

# Build model
print('--> Building model')
ret = rknn.build(do_quantization=False)
# Print error message
if ret != 0:
    print('Build failed!')
    exit(ret)
print('done')

# Export rknn model
print('--> Export RKNN model')
ret = rknn.export_rknn('./openpose_caffe.rknn')
# Print error message
from rknn.api import RKNN

if __name__ == '__main__':
    rknn = RKNN()

    # rknn.config(channel_mean_value='0 0 0 1', reorder_channel='0 1 2',
    #             quantized_dtype='dynamic_fixed_point-8')
    rknn.config(channel_mean_value='0 0 0 1',
                reorder_channel='0 1 2',
                quantized_dtype='dynamic_fixed_point-16',
                batch_size=2)

    print('--> Loading model')
    rknn.load_caffe(model='./enet_deploy_final.prototxt',
                    proto='caffe',
                    blobs='./cityscapes_weights.caffemodel')
    print('done')

    print('--> Building model')
    rknn.build(do_quantization=True, dataset='./dateset.txt')
    # rknn.build(do_quantization=False)
    print('done')

    # Export and save the rknn model file
    rknn.export_rknn('./cityscapes.rknn')

    # Release RKNN context
    rknn.release()