def load_modules():
    """Load modules data to database."""

    print("Modules")

    # delete modules before data gets added to avoid duplicate info
    # Module.query.delete()

    # insert data from seed_module
    with open("seed_data/seed_module") as module_data:
        for row in module_data:
            name, description, additional_info, user_id = row.rstrip().split("|")

            module = Module(name=name,
                            description=description,
                            additional_info=additional_info,
                            user_id=user_id)

            # add module to session
            db.session.add(module)

    # commit changes
    db.session.commit()

    print("Modules loaded.")
def __init__(self, ip, port):
    self.ip = ip
    self.port = port
    self.buffsize = 2048 ** 2

    self.main_socket = s.socket(s.AF_INET, s.SOCK_STREAM)  # create TCP socket
    # SO_REUSEADDR expects a boolean flag (1), not the port number
    self.main_socket.setsockopt(s.SOL_SOCKET, s.SO_REUSEADDR, 1)
    self.main_socket.bind((self.ip, self.port))
    self.main_socket.listen(5)

    self.server_is_open = True  # is the game server on?
    self.games = []  # list of on-going games
    self.names = {}  # dictionary of client and his game name
    self.timers = {}  # dictionary of each game and its time state {Game: True/False}
    self.question_start_time = {}  # dictionary of {Game: current_question_start_time}
    self.wall_displays = {}  # dictionary of all wall display sockets and their games {socket: Game}
    self.next_question_request = []  # list of wall displays that requested a next question (for a specific round)
    self.validated = {}  # dictionary of clients and their validated games
    self.validated_login = {}  # dictionary of validated username and password clients, values: True or number of failed logins
    self.messages = {}  # {socket: [messages], ...}
    self.inputs = [self.main_socket]
    self.outputs = []
    self.connected = {}  # dictionary of connected clients (including clients not in games) {clientobj: (Game, player_name), ...}
    self.m = Module()  # module object for database interaction
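The original constructor passed `self.port` as the `SO_REUSEADDR` option value; the option takes a boolean flag, not a port. A minimal self-contained sketch of the corrected listener setup (binding to port 0 here so the OS picks a free port; the real server binds to its configured port):

```python
import socket

# Create a listening TCP socket with address reuse enabled.
# SO_REUSEADDR takes a flag (1/0), not a port number.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("127.0.0.1", 0))  # port 0: let the OS choose a free port
server.listen(5)

host, port = server.getsockname()
reuse = server.getsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR)
server.close()
```

`SO_REUSEADDR` lets the server rebind immediately after a restart instead of failing while the old socket sits in TIME_WAIT.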
def main():
    opts = optparser.parse_args()[0]
    train_loader = Loader(opts.train)
    opts.vocab_len = len(train_loader._char_to_id)
    opts.pos_len = len(train_loader._pos_to_id)
    opts.max_pos_len = train_loader._pos_max_len
    opts.max_target_len = train_loader._char_max_len
    opts.use_cuda = opts.use_cuda == 1
    opts.eval = opts.eval == 1
    opts.data_size = train_loader.get_data_size()

    if not torch.cuda.is_available():
        opts.use_cuda = False

    torch.manual_seed(opts.seed)
    np.random.seed(opts.seed)

    if not opts.eval:
        # weights for paddings, set to 0
        loss_weights = torch.ones(opts.vocab_len)
        loss_weights[0] = 0
        # reduction='sum' replaces the deprecated size_average=False
        criterion = nn.NLLLoss(loss_weights, reduction='sum')

        c2i, i2c, p2i, i2p = train_loader.get_mappings()
        dev_loader = Loader(opts.dev, c2i, i2c, p2i, i2p)
        if dev_loader._pos_max_len > opts.max_pos_len:
            opts.max_pos_len = dev_loader._pos_max_len

        model = Module(opts)
        if opts.model_path != '':
            model = torch.load(opts.model_path)

        train_batcher = Batcher(opts.batch_size, train_loader.get_data(),
                                opts.max_pos_len, opts.eval)
        dev_batcher = Batcher(decode_batch, dev_loader.get_data(),
                              opts.max_pos_len, True)
        print(model)
        start_train(model, criterion, opts, train_batcher, dev_batcher)
    else:
        model = torch.load(opts.model_path)
        model.eval()
        # print(model)
        c2i, i2c, p2i, i2p = train_loader.get_mappings()
        test_loader = Loader(opts.test, c2i, i2c, p2i, i2p)
        if test_loader._pos_max_len > opts.max_pos_len:
            opts.max_pos_len = test_loader._pos_max_len
        test_batcher = Batcher(1, test_loader.get_data(),
                               opts.max_pos_len, opts.eval)
        opts.data_size = test_loader.get_data_size()
        decode(model, opts, test_batcher, i2c, i2p)
def add_modules(username):
    """Add function/module information."""

    if not verify_user(username):
        return redirect("/login")

    mname = request.form.get("mname")
    mdesc = request.form.get("mdesc")
    maddinfo = request.form.get("maddinfo")
    fname = request.form.get("fname")
    fdesc = request.form.get("fdesc")
    faddinfo = request.form.get("faddinfo")
    samplecode = request.form.get("samplecode")
    output = request.form.get("output")

    if fname == "":
        flash("Please input a function name.")
        return redirect("/{}/addmodules".format(username))

    # fetch user to get user_id
    user = User.query.filter_by(username=username).first()

    existing_mod = Module.query.filter(
        (Module.user_id == user.user_id) | (Module.user_id == 1),
        Module.name == mname).first()

    if mname == "":
        module = Module.query.filter_by(module_id=1).first()
    elif existing_mod:
        module = existing_mod
    else:
        module = Module(name=mname,
                        description=mdesc,
                        additional_info=maddinfo,
                        user_id=user.user_id)
        db.session.add(module)
        db.session.commit()

    function = Function(name=fname,
                        description=fdesc,
                        additional_info=faddinfo,
                        sample_code=samplecode,
                        output=output,
                        user_id=user.user_id,
                        module_id=module.module_id)
    db.session.add(function)
    db.session.commit()

    flash("Your notes have been added.")
    return redirect("/{}/studynotes".format(username))
"mod_code": "cmpu1004", }, { "student_no": "c4444", "mod_code": "cmpu1005", }, ] for student in students: try: new_student = Student(**student) session.add(new_student) session.commit() except: pass for module in modules: try: new_module = Module(**module) session.add(new_module) session.commit() except: pass for entry in student_modules: try: new_entry = StudentModule(**entry) session.add(new_entry) session.commit() except: pass
def setUp(self):
    self.cycles = 100
    self.m = Module()
User(name='Bob Tan', username='******', password='******'),  # temp_pass
User(name='Mr. GovTech', username='******', password='******'),  # govtech_strong_password
User(name='Admin', username='******', password='******'),  # super_duper_whitehacks_strong_password
Module(code='IS200', name='Software Foundations'),
Module(code='IS103', name='Computational Thinking'),
Module(code='IS101', name='Seminar on Information Systems'),
Module(code='WRIT001', name='Academic Writing'),
Lesson(module_code='IS200', name='Lesson 01'),
Lesson(module_code='IS103', name='Lesson 01'),
Lesson(module_code='IS101', name='Lesson 01'),
Lesson(module_code='WRIT001', name='Lesson 01'),
Document(
    lesson_id=1,
    name='Document 01',
    is_draft=False,
    content='Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum'),
Document(lesson_id=4,
def __getitem__(self, idx):
    return (self.x[idx], self.t[idx], idx)


# =====================================================================================================
# batch_size = {'train': 32, 'valid': 32}
dataloader = {
    phase: torch.utils.data.DataLoader(dataset=SrDataset(phase, dire, width, height),
                                       batch_size=batch_size[phase],
                                       shuffle=False)
    for phase in ['valid']
}

use_gpu = torch.cuda.is_available()

module = Module()
module.load_state_dict(torch.load(pretrained))

# dump raw parameter tensors to a binary file
with open('parameters', 'wb') as fid:
    for param in module.parameters():
        fid.write(param.data.numpy().tobytes())

if use_gpu:
    module.cuda()
    module = nn.DataParallel(module, gpu)

for stage in ([0] * 1):  # for epoch in range(1):
    for phase in ["valid"]:
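Dumping raw parameter bytes as above discards shape and dtype; whoever reads the file must know both out-of-band. A stdlib-only sketch of the same raw-dump idea and its readback (the float values here are illustrative stand-ins for model weights):

```python
import array
import os
import tempfile

params = array.array('f', [0.5, -1.25, 3.0])  # pretend these are model weights

# dump raw float32 bytes (machine byte order), as the parameter dump above does
path = os.path.join(tempfile.mkdtemp(), 'parameters')
with open(path, 'wb') as fid:
    fid.write(params.tobytes())

# reading back requires knowing the dtype and element count in advance
restored = array.array('f')
with open(path, 'rb') as fid:
    restored.frombytes(fid.read())
```

For anything beyond a quick inspection, a self-describing format (`numpy.save`, or `torch.save` of the state dict itself) avoids this bookkeeping.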
def setUp(self):
    self.m = Module()
def __init__(self, moduleList):
    # some definitions used during the bsv compilation process
    env = moduleList.env
    self.moduleList = moduleList
    self.hw_dir = env.Dir(moduleList.env['DEFS']['ROOT_DIR_HW'])
    self.TMP_BSC_DIR = env['DEFS']['TMP_BSC_DIR']
    synth_modules = moduleList.synthBoundaries()

    self.USE_TREE_BUILD = moduleList.getAWBParam('wrapper_gen_tool',
                                                 'USE_BUILD_TREE')

    # all_module_dirs: a list of all module directories in the build tree
    self.all_module_dirs = [self.hw_dir.Dir(moduleList.topModule.buildPath)]
    for module in synth_modules:
        if (module.buildPath != moduleList.topModule.buildPath):
            self.all_module_dirs += [self.hw_dir.Dir(module.buildPath)]

    # all_build_dirs: the build (.bsc) sub-directory of all module directories
    self.all_build_dirs = [d.Dir(self.TMP_BSC_DIR) for d in self.all_module_dirs]

    # Include iface directories
    self.all_module_dirs += iface_tool.getIfaceIncludeDirs(moduleList)
    self.all_build_dirs += iface_tool.getIfaceLibDirs(moduleList)

    # Add the top level build directory
    self.all_build_dirs += [env.Dir(self.TMP_BSC_DIR)]

    self.all_module_dirs += [
        self.hw_dir.Dir('include'),
        self.hw_dir.Dir('include/awb/provides')
    ]

    # Full search path: all module and build directories
    self.all_lib_dirs = self.all_module_dirs + self.all_build_dirs

    all_build_dir_paths = [d.path for d in self.all_build_dirs]
    self.ALL_BUILD_DIR_PATHS = ':'.join(all_build_dir_paths)

    all_lib_dir_paths = [d.path for d in self.all_lib_dirs]
    self.ALL_LIB_DIR_PATHS = ':'.join(all_lib_dir_paths)

    # we need to annotate the module list with the
    # bluespec-provided library files. Do so here.
    bsv_tool.decorateBluespecLibraryCode(moduleList)

    self.TMP_BSC_DIR = moduleList.env['DEFS']['TMP_BSC_DIR']
    self.BUILD_LOGS_ONLY = moduleList.getAWBParam('bsv_tool', 'BUILD_LOGS_ONLY')
    self.USE_BVI = moduleList.getAWBParam('bsv_tool', 'USE_BVI')

    self.pipeline_debug = model.getBuildPipelineDebug(moduleList)

    # Should we be building in events?
    if (model.getEvents(moduleList) == 0):
        bsc_events_flag = ' -D HASIM_EVENTS_ENABLED=False '
    else:
        bsc_events_flag = ' -D HASIM_EVENTS_ENABLED=True '

    self.BSC_FLAGS = moduleList.getAWBParam('bsv_tool', 'BSC_FLAGS') + bsc_events_flag

    moduleList.env.VariantDir(self.TMP_BSC_DIR, '.', duplicate=0)
    # need to set the builddir for synplify
    moduleList.env['ENV']['BUILD_DIR'] = moduleList.env['DEFS']['BUILD_DIR']

    topo = moduleList.topologicalOrderSynth()
    topo.reverse()

    # Cleaning? Wipe out module temporary state. Do this before
    # the topo pop to ensure that we don't leave garbage around at
    # the top level.
    if moduleList.env.GetOption('clean'):
        for module in topo:
            MODULE_PATH = get_build_path(moduleList, module)
            os.system('cd ' + MODULE_PATH + '/' + self.TMP_BSC_DIR +
                      '; rm -f *.ba *.c *.h *.sched *.log *.v *.bo *.str')

    topo.pop()  # get rid of top module.

    ## Python module that generates a wrapper to connect the exposed
    ## wires of all synthesis boundaries.
    tree_builder = bsv_tool.BSVSynthTreeBuilder(self)

    ##
    ## Is this a normal build or a build in which only Bluespec dependence
    ## is computed?
    ##
    if not moduleList.isDependsBuild:
        ##
        ## Normal build.
        ##
        ## Now that the "depends-init" build is complete we can
        ## continue with accurate inter-Bluespec file dependence.
        ## This build only takes place for the first pass object
        ## code generation. If the first pass li graph exists, it
        ## subsumes awb-style synthesis boundary generation.
        ##
        for module in topo:
            self.build_synth_boundary(moduleList, module)

        ## We are going to have a whole bunch of BA and V files coming.
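The clean step above shells out with `cd … ; rm -f *.ba …`, which is fragile (no quoting, silent failure on a bad path). The same wipe can be done portably with the stdlib; a minimal sketch, where `clean_build_dir` is a hypothetical helper and the directory is a temp-dir stand-in for a module's `.bsc` directory:

```python
import pathlib
import tempfile

# extensions of Bluespec build by-products to wipe, mirroring the rm -f above
CLEAN_SUFFIXES = {'.ba', '.c', '.h', '.sched', '.log', '.v', '.bo', '.str'}

def clean_build_dir(build_dir: pathlib.Path) -> int:
    """Remove build by-products from one module's build directory."""
    removed = 0
    for f in build_dir.iterdir():
        if f.is_file() and f.suffix in CLEAN_SUFFIXES:
            f.unlink()
            removed += 1
    return removed

# demo on a throwaway directory
d = pathlib.Path(tempfile.mkdtemp())
for name in ('mod.ba', 'mod.v', 'mod.bsv', 'notes.txt'):
    (d / name).touch()
removed = clean_build_dir(d)
left = sorted(p.name for p in d.iterdir())
```

Unlike the shell version, this raises on a missing directory instead of silently doing nothing, and leaves source files (`.bsv`) untouched by construction.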
        ## We don't yet know what they contain, but we do know that there
        ## will be |synth_modules| - 2 of them
        if (not 'GEN_VERILOGS' in moduleList.topModule.moduleDependency):
            moduleList.topModule.moduleDependency['GEN_VERILOGS'] = []
        if (not 'GEN_BAS' in moduleList.topModule.moduleDependency):
            moduleList.topModule.moduleDependency['GEN_BAS'] = []

        ## Having described the new build tree dependencies we can build
        ## the top module.
        self.build_synth_boundary(moduleList, moduleList.topModule)

        ## Merge all synthesis boundaries using a tree? The tree reduces
        ## the number of connections merged in a single compilation, allowing
        ## us to support larger systems.
        if self.USE_TREE_BUILD:
            tree_builder.setupTreeBuild(moduleList, topo)

        ##
        ## Generate the global string table. Bluespec-generated global
        ## strings are stored in files by the compiler.
        ##
        ## The global string file will be generated in the top-level
        ## .bsc directory and a link to it will be added to the
        ## top-level directory.
        ##
        all_str_src = []
        # for module in topo + [moduleList.topModule]:
        for module in moduleList.moduleList + topo + [moduleList.topModule]:
            if ('STR' in module.moduleDependency):
                all_str_src.extend(module.moduleDependency['STR'])

        if (self.BUILD_LOGS_ONLY == 0):
            bsc_str = moduleList.env.Command(
                self.TMP_BSC_DIR + '/' + moduleList.env['DEFS']['APM_NAME'] + '.str',
                all_str_src,
                ['cat $SOURCES > $TARGET'])
            strDep = moduleList.env.Command(
                moduleList.env['DEFS']['APM_NAME'] + '.str',
                bsc_str,
                ['ln -fs ' + self.TMP_BSC_DIR + '/`basename $TARGET` $TARGET'])
            moduleList.topDependency += [strDep]

        if moduleList.env.GetOption('clean'):
            print('Cleaning depends-init...')
            s = os.system('scons --clean depends-init')
    else:
        ##
        ## Dependence build. The target of this build is "depends-init". No
        ## Bluespec modules will be compiled in this invocation of SCons.
        ## Only .depends-bsv files will be produced.
        ##

        # We need to calculate some dependencies for the build
        # tree.
        # We could be clever and put this code somewhere
        # rather than replicate it.
        if self.USE_TREE_BUILD:
            buildTreeDeps = {}
            buildTreeDeps['GEN_VERILOGS'] = []
            buildTreeDeps['GEN_BAS'] = []
            # This is sort of a hack.
            buildTreeDeps['WRAPPER_BSHS'] = ['awb/provides/soft_services.bsh']
            buildTreeDeps['GIVEN_BSVS'] = []
            buildTreeDeps['BA'] = []
            buildTreeDeps['STR'] = []
            buildTreeDeps['VERILOG'] = []
            buildTreeDeps['BSV_LOG'] = []
            buildTreeDeps['VERILOG_STUB'] = []

            tree_module = Module('build_tree', ["mkBuildTree"],
                                 moduleList.topModule.buildPath,
                                 moduleList.topModule.name,
                                 [], moduleList.topModule.name, [],
                                 buildTreeDeps,
                                 platformModule=True)
            tree_module.dependsFile = '.depends-build-tree'

            moduleList.insertModule(tree_module)
            tree_file_bo = get_build_path(moduleList, moduleList.topModule) + "/build_tree.bsv"

            # sprinkle files to get dependencies right
            bo_handle = open(tree_file_bo, 'w')
            # mimic AWB/leap-configure
            bo_handle.write('//\n')
            bo_handle.write('// Synthesized compilation file for module: build_tree\n')
            bo_handle.write('//\n')
            bo_handle.write('// This file was created by BSV.py\n')
            bo_handle.write('//\n')
            bo_handle.write('`define BUILDING_MODULE_build_tree\n')
            bo_handle.write('`include "build_tree_Wrapper.bsv"\n')
            bo_handle.close()

            # Calling generateWrapperStub will write out default _Wrapper.bsv
            # and _Log.bsv files for the build tree. However, these files
            # may already exist, and, in the case of build_tree_Wrapper.bsv,
            # have meaningful content. Fortunately, generateWrapperStub
            # will not overwrite existing files.
            wrapper_gen_tool.generateWrapperStub(moduleList, tree_module)
            wrapper_gen_tool.generateAWBCompileWrapper(moduleList, tree_module)
            topo.append(tree_module)

        deps = []

        useDerived = True
        first_pass_LI_graph = wrapper_gen_tool.getFirstPassLIGraph()
        if first_pass_LI_graph is not None:
            useDerived = False
            # we also need to parse the platform_synth file in th
            platform_synth = get_build_path(moduleList, moduleList.topModule) + \
                "/" + moduleList.localPlatformName + "_platform_synth.bsv"
            platform_deps = ".depends-platform"
            deps += self.compute_dependence(moduleList,
                                            moduleList.topModule,
                                            useDerived,
                                            fileName=platform_deps,
                                            targetFiles=[platform_synth])

            # If we have an LI graph, we need to construct and compile
            # several LI wrappers. Do that here.
            # Include all the dependencies in the graph in the wrapper.
            li_wrappers = []
            tree_base_path = get_build_path(moduleList, moduleList.topModule)
            liGraph = LIGraph([])
            firstPassGraph = first_pass_LI_graph
            # We should ignore the 'PLATFORM_MODULE'
            liGraph.mergeModules([
                module for module in getUserModules(firstPassGraph)
                if module.getAttribute('RESYNTHESIZE') is None
            ])
            for module in sorted(liGraph.graph.nodes(),
                                 key=lambda module: module.name):
                wrapper_import_path = tree_base_path + '/' + module.name + '_Wrapper.bsv'
                li_wrappers.append(module.name + '_Wrapper.bsv')
                wrapper_import_handle = open(wrapper_import_path, 'w')
                wrapper_import_handle.write('import Vector::*;\n')
                wrapper_gen_tool.generateWellKnownIncludes(wrapper_import_handle)
                wrapper_gen_tool.generateBAImport(module, wrapper_import_handle)
                wrapper_import_handle.close()
                platform_deps = ".depends-" + module.name
                deps += self.compute_dependence(moduleList,
                                                moduleList.topModule,
                                                useDerived,
                                                fileName=platform_deps,
                                                targetFiles=[wrapper_import_path])

        for module in topo + [moduleList.topModule]:
            # for object import builds no Wrapper code will be included. remove it.
            deps += self.compute_dependence(moduleList,
                                            module,
                                            useDerived,
                                            fileName=module.dependsFile)

        moduleList.topDependsInit += deps