def index():
    users = read("SELECT * FROM [dbo].[User]")
    roles = read("SELECT * FROM [dbo].[Role]")
    privileges = read("SELECT * FROM [dbo].[Privilege]")
    access = read("SELECT * FROM [dbo].[Access]")
    tables = read("SELECT * FROM [dbo].[Table]")
    return render_template("index.html", users=users, roles=roles,
                           privileges=privileges, access=access, tables=tables)

def fetch_tags_problems():
    sql = "SELECT DISTINCT(pid) from ptag"
    res = db.read(sql, cursor)
    problem_list = []
    for i in res:
        problem = str(i[0])
        if problem not in problem_list:
            problem_list.append(problem)
    sql = "SELECT tag from tag"
    a = db.read(sql, cursor)
    for i in a:
        url = "http://codeforces.com/api/problemset.problems"
        tag = str(i[0].encode('utf8'))
        payload = {'tags': tag}
        r = requests.get(url, params=payload)
        if r.status_code != 200:
            print r.status_code, " returned from ", r.url
            continue
        else:
            res = r.json()
            problems = res['result']['problems']
            problemStatistics = res['result']['problemStatistics']
            sql = ""
            for j in problems:
                code = str(j['contestId']) + str(j['index'])
                code = "cfs" + code
                if code not in problem_list:
                    sql += "(\'" + code + "\', \'" + tag + "\'), "
            if sql != "":
                sql = sql[:-2]
                sql = "INSERT INTO ptag (pid, tag) VALUES " + sql
                print sql
                result = db.write(sql, cursor, conn)

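The snippet above builds its multi-row INSERT by string concatenation, which is fragile against quoting and injection. A minimal sketch of the same batch insert using driver-side placeholders (sqlite3 stands in for the original MySQL connection; `insert_missing_ptags` and its arguments are illustrative names, not part of the original code):

```python
import sqlite3

def insert_missing_ptags(conn, pairs, existing):
    # pairs: iterable of (pid, tag); existing: set of pids already in ptag
    rows = [(pid, tag) for pid, tag in pairs if pid not in existing]
    if rows:
        # executemany runs one parameterized INSERT per row,
        # so quoting/escaping is handled by the driver
        conn.executemany("INSERT INTO ptag (pid, tag) VALUES (?, ?)", rows)
        conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ptag (pid TEXT, tag TEXT)")
n = insert_missing_ptags(conn, [("cfs1A", "dp"), ("cfs2B", "graphs")], {"cfs1A"})
```

MySQL drivers use `%s` instead of `?` as the placeholder, but the pattern is otherwise the same.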
def index_handler():
    table_results = db.read('SELECT * FROM lists')
    cards_results = db.read('SELECT * FROM cards')
    res = []
    for i in table_results:
        cards = []
        for j in cards_results:
            if i['id'] == j['listId']:
                cards.append(j)
        item = {'id': i['id'], 'title': i['name'], 'cards': cards}
        res.append(item)
    return response(200, json.dumps(res))

def update(event, context):
    log.info('PUT to update')
    db_connection = db.get_db_connection()
    key = event['pathParameters']['id']
    value = json.loads(event['body'])
    if 'data' not in value:
        raise Exception('Missing parameter: data')
    if not db.read(db_connection, key):
        response = {'statusCode': 404, 'body': {'error': 'Not found'}}
    elif db.write(db_connection, key, value['data']):
        read = db.read(db_connection, key)
        response = {'statusCode': 200, 'body': read}
    else:
        # guard added: without this branch a failed write leaves
        # `response` unbound and the return raises NameError
        response = {'statusCode': 500, 'body': {'error': 'Write failed'}}
    return response

def fetch_user_activity_erd(uid="", handle=""):
    '''
    | Fetch User's activity from Erdos
    '''
    url = "http://erdos.sdslabs.co/activity/users/" + handle[3:] + ".json"
    print url
    sql = "SELECT MAX(created_at) FROM activity WHERE handle = \'" + handle + "\'"
    print sql
    res = db.read(sql, cursor)
    if res[0][0] is None:
        last_activity = 0
    else:
        last_activity = int(res[0][0])
    payload = {}
    payload['start_time'] = last_activity
    r = requests.get(url, params=payload)
    if r.status_code != 200:
        print r.status_code, " returned from ", r.url
    else:
        result = r.json()['list']
        result.reverse()
        for act in result:
            if int(act['created_at']) > last_activity:
                sql = "SELECT pid FROM activity WHERE pid = \'erd" + act['problem_id'] + "\' AND handle = \'" + handle + "\'"
                check = db.read(sql, cursor)
                difficulty = 0
                if check == ():
                    sql = "INSERT INTO activity (handle, pid, attempt_count, status, difficulty, uid, created_at) VALUES ( \'" + handle + "\', \'erd" + act['problem_id'] + "\', '1', " + str(act['status']) + ", " + str(difficulty) + ", " + uid + ", " + str(act['created_at']) + " )"
                    db.write(sql, cursor, conn)
                    p = problem("erd" + act['problem_id'])
                    if p.exists_in_db != -1:
                        tag_data = p.tag
                        for tag in tag_data:
                            sql = "SELECT tag FROM user_tag_score WHERE tag = \'" + tag + "\' AND handle = \'" + handle + "\'"
                            tag_check = db.read(sql, cursor)
                            if tag_check == ():
                                sql = "INSERT INTO user_tag_score (handle, tag, score) VALUES ( \'" + handle + "\' , \'" + tag + "\' , " + str(tag_data[tag]) + " )"
                                db.write(sql, cursor, conn)
                            else:
                                sql = "UPDATE user_tag_score SET score = score +" + str(tag_data[tag]) + " WHERE tag = \'" + tag + "\' AND handle = \'" + handle + "\'"
                                db.write(sql, cursor, conn)
                else:
                    sql = "UPDATE activity SET attempt_count = attempt_count + 1, status = " + str(act['status']) + ", difficulty = " + str(difficulty) + ", created_at = " + str(act['created_at']) + " WHERE pid = \'erd" + act['problem_id'] + "\' AND handle = \'" + handle + "\'"
                    db.write(sql, cursor, conn)
                    print sql

def fetch_user_activity_cfs(handle=""):
    '''
    | Fetch User's activity from Codeforces
    '''
    cfs_url = "http://codeforces.com/api/user.status"
    payload = {}
    payload['handle'] = handle
    handle = 'cfs' + handle
    sql = "SELECT created_at FROM activity WHERE handle = \'" + handle + "\' ORDER BY created_at DESC LIMIT 1;"
    res = db.read(sql, cursor)
    if res == ():
        last_activity = 0
    else:
        last_activity = res[0][0]
    last_activity = int(last_activity)
    r = requests.get(cfs_url, params=payload)
    if r.status_code != 200:
        print r.status_code, " returned from ", r.url
    else:
        result = r.json()['result']
        result.reverse()
        for act in result:
            if int(act['creationTimeSeconds']) > last_activity:
                sql = "SELECT pid FROM activity WHERE pid = \'cfs" + str(act['problem']['contestId']) + str(act['problem']['index']) + "\' AND handle = \'" + handle + "\'"
                check = db.read(sql, cursor)
                difficulty = 0
                if act['verdict'] == "OK":
                    status = 1
                else:
                    status = 0
                if check == ():
                    sql = "INSERT INTO activity (handle, pid, attempt_count, status, difficulty, created_at) VALUES ( \'" + handle + "\', \'cfs" + str(act['problem']['contestId']) + str(act['problem']['index']) + "\', '1', " + str(status) + ", " + str(difficulty) + ", " + str(act['creationTimeSeconds']) + " )"
                    db.write(sql, cursor, conn)
                    p = problem("cfs" + str(act['problem']['contestId']) + str(act['problem']['index']))
                    if p.exists_in_db != -1:
                        tag_data = p.tag
                        for tag in tag_data:
                            sql = "SELECT tag FROM user_tag_score WHERE tag = \'" + tag + "\' AND handle = \'" + handle + "\'"
                            tag_check = db.read(sql, cursor)
                            if tag_check == ():
                                sql = "INSERT INTO user_tag_score (handle, tag, score) VALUES ( \'" + handle + "\' , \'" + tag + "\' , " + str(tag_data[tag]) + " )"
                                db.write(sql, cursor, conn)
                            else:
                                sql = "UPDATE user_tag_score SET score = score + " + str(tag_data[tag]) + " WHERE tag = \'" + tag + "\' AND handle = \'" + handle + "\'"
                                db.write(sql, cursor, conn)
                else:
                    sql = "UPDATE activity SET attempt_count = attempt_count + 1, status = " + str(status) + ", difficulty = " + str(difficulty) + ", created_at = " + str(act['creationTimeSeconds']) + " WHERE pid = \'cfs" + str(act['problem']['contestId']) + str(act['problem']['index']) + "\' AND handle = \'" + handle + "\'"
                    db.write(sql, cursor, conn)

def fetch_users():
    sql_user = "******"
    result = db.read(sql_user, cursor)
    user_list = {}  # using dict for faster lookup
    for i in result:
        user_list[i[0]] = 0
    new_user = []
    done = 0
    total = 1
    page = 1
    while done < total:
        user_url = "http://erdos.sdslabs.co/users.json?page=" + str(page)
        user_r = requests.get(user_url)
        if user_r.status_code != 200:
            print user_r.status_code, " returned from ", user_url
        else:
            total = user_r.json()['TOTAL']
            done += 500
            user_res = user_r.json()['list']
            for i in user_res:
                if i['username'] not in user_list:
                    new_user.append(i['username'])
        page += 1
    if new_user:
        sql = "INSERT INTO user (erd_handle, cfs_handle) VALUES "
        for i in new_user:
            sql += "(\'erd" + str(i) + "\',\'cfs\'), "
        sql = sql[:-2]
        print sql
        db.write(sql, cursor, conn)
        db.write(sql, remote_cursor, remote_conn)

def recommend_problems_from_tag(self, mode):
    '''
    | Create list of recommended problems by taking weighted sum of similar users' ratings
    '''
    tags = self.reco_algo(mode)
    tags = dict(tags)
    sql = "SELECT * FROM ptag WHERE pid NOT IN (SELECT pid FROM activity WHERE uid = \'" + str(self.uid) + "\' AND status = 1)"
    result = db.read(sql, self.cursor)
    plist = defaultdict(list)
    plist_score = {}
    for i in result:
        plist[i[0]].append(i[1])
    if len(tags) > 0:
        for i in plist:
            a = len(plist[i])
            plist_score[i] = 0
            for j in plist[i]:
                if j in tags:
                    plist_score[i] += tags[j] / a
                elif j in self.difficulty_matrix[self.erd_handle]:
                    plist_score[i] += self.difficulty_matrix[self.erd_handle][j] / a
                elif j in self.difficulty_matrix[self.cfs_handle]:
                    plist_score[i] += self.difficulty_matrix[self.cfs_handle][j] / a
    plist_score = sorted(plist_score.items(), key=operator.itemgetter(1), reverse=1)
    self.log_results_db(plist_score)

def evaluate_user_ranking_accuracy(self):
    '''
    | Validate the calculated user ranking against codeforces ranks
    '''
    sql = "SELECT * FROM user WHERE cfs_handle != 'cfs' AND cfs_handle != ''"
    user_list = db.read(sql, self.cursor)
    score = 0
    for user in user_list:
        handle = user[2]
        rating = int(user[4])
        pred_sim_users = self.find_similar_users(handle)
        act_sim_users = {}
        for i in user_list:
            act_sim_users[i[2]] = abs(rating - int(i[4]))
        common = set(pred_sim_users.keys()) & set(act_sim_users.keys())
        pred = list(pred_sim_users[c] for c in common)
        act = list(act_sim_users[c] for c in common)
        sim = self.cosine_similarity(pred, act)
        if sim != 0:
            print "accuracy for user: "******" = " + str(sim)
        score += sim
    score = score / len(user_list)
    return score

def increment_problem():
    '''function to fetch newly added problems'''
    sql = "SELECT pid FROM `problem` WHERE MID(pid, 1, 3) = \"cfs\""
    a = db.read(sql, cursor)
    problem_list = []
    for i in a:
        pid = str(i[0].encode('utf8'))
        problem_list.append(pid)
    pageno = 1
    while increment_problem_from_page(pageno, problem_list) == 0:
        pageno += 1
    if len(new_problem) > 0:
        sql = "INSERT INTO problem (pid, name, time) VALUES "
        for i in new_problem.keys():
            j = new_problem[i]
            a = str(j[0]).replace('"', '\\"')
            a = a.replace("'", "\\'")
            b = j[1].encode('utf8')
            b = str(b).replace('"', '\\"')
            b = b.replace("'", "\\'")
            sql += "('" + str(a) + "','" + str(b) + "','" + str(int(time())) + "'), "
        sql = sql[:-2]
        result = db.write(sql, cursor, conn)

def read(self):
    """Read the database in memory."""
    for db in self.dbs:
        result = db.read()
        if result != self.OK:
            return result
    return self.OK

def setdefaultprogram(self):
    sql = """SELECT program_id, code, name, relative_times, repeat_program, default_program, selected
             FROM programs
             WHERE module = {0} AND (default_program = 1 OR selected = 1)
             ORDER BY selected DESC, default_program DESC""".format(db.dbstr(self.name()))
    programs = db.read(sql, 0, 1)
    if not programs:
        self.program = {}
    else:
        self.program = {
            ProgramCodes.PROGRAM_ID: programs[0]['program_id'],
            ProgramCodes.CODE: programs[0]['code'],
            ProgramCodes.NAME: programs[0]['name'],
            ProgramCodes.RELATIVE_TIMES: programs[0]['relative_times'],
            ProgramCodes.REPEAT_PROGRAM: programs[0]['repeat_program'],
            ProgramCodes.DEFAULT_PROGRAM: programs[0]['default_program'],
            ProgramCodes.SELECTED: programs[0]['selected'],
            ProgramCodes.START_TIME: datetime.datetime.now(),
            ProgramCodes.MESSAGE: ""
        }
    self.onprogramchanged()
    return True

def setprogram(self, program, message=""):
    sql = """SELECT program_id, code, name, relative_times, repeat_program, default_program, selected
             FROM programs
             WHERE module = {0} AND code = {1}""".format(db.dbstr(self.name()), db.dbstr(program))
    programs = db.read(sql, 0, 1)
    self.program = {
        ProgramCodes.PROGRAM_ID: programs[0]['program_id'],
        ProgramCodes.CODE: programs[0]['code'],
        ProgramCodes.NAME: programs[0]['name'],
        ProgramCodes.RELATIVE_TIMES: programs[0]['relative_times'],
        ProgramCodes.REPEAT_PROGRAM: programs[0]['repeat_program'],
        ProgramCodes.DEFAULT_PROGRAM: programs[0]['default_program'],
        ProgramCodes.SELECTED: programs[0]['selected'],
        ProgramCodes.START_TIME: datetime.datetime.now(),
        ProgramCodes.MESSAGE: message
    }
    if self.program[ProgramCodes.REPEAT_PROGRAM]:
        self.debug("Setting program {0} as selected program".format(self.program[ProgramCodes.CODE]))
        db.write("UPDATE programs SET selected = 0 WHERE module = {0} AND selected = 1".format(db.dbstr(self.name())))
        db.write("UPDATE programs SET selected = 1 WHERE program_id = {0}".format(self.program[ProgramCodes.PROGRAM_ID]))
    self.onprogramchanged()
    return True

def fetch_users():
    sql_user = "******"
    result = db.read(sql_user, cursor)
    user_list = {}  # using dict for faster lookup
    for i in result:
        user_list[i[0]] = 0
    new_user = []
    done = 0
    total = 1
    page = 1
    while done < total:
        user_url = base_url + "/users.json?page=" + str(page)
        user_r = requests.get(user_url)
        if user_r.status_code != 200:
            print user_r.status_code, " returned from ", user_url
        else:
            total = user_r.json()['TOTAL']
            done += 500
            user_res = user_r.json()['list']
            for i in user_res:
                if i['username'] not in user_list:
                    new_user.append(i['username'])
        page += 1
    if new_user:
        sql = "INSERT INTO user (erd_handle, cfs_handle) VALUES "
        for i in new_user:
            sql += "(\'erd" + str(i) + "\',\'cfs\'), "
        sql = sql[:-2]
        print sql
        db.write(sql, cursor, conn)
        db.write(sql, remote_cursor, remote_conn)

def test_db(self):
    data_uuid = uuid.uuid4().hex
    data = {
        'name': '',
        'branches': [(1, 2), (2, 3), (3, 4), (4, 1), (2, 5), (1, 5)],
        'nodes': [1, 2, 3, 4, 5],
        'uuid': data_uuid
    }
    db.insert(data)
    db_data = db.read(data_uuid)
    expected = [{
        'name': '',
        'branches': [[1, 2], [2, 3], [3, 4], [4, 1], [2, 5], [1, 5]],
        'nodes': [1, 2, 3, 4, 5],
        'uuid': data_uuid
    }]
    self.assertListEqual(db_data, expected)

def increment_tags():
    '''Function to add new tags to db'''
    fetch_all_tags()
    new_tags = []
    tag_list = []
    sql = "SELECT tag from tag"
    a = db.read(sql, cursor)
    for i in a:
        tag = str(i[0].encode('utf8'))
        tag_list.append(tag)
    for i in tags.keys():
        if tags[i] not in tag_list:
            new_tags.append(i)
    if len(new_tags) > 0:
        sql = "INSERT INTO tag (tag, description, time) VALUES "
        for i in new_tags:
            a = str(i).replace('"', '\\"')
            a = a.replace("'", "\\'")
            b = str(tags[i]).replace('"', '\\"')
            # bug fix: escape single quotes in the already-escaped string
            # instead of rebuilding it (which discarded the double-quote escaping)
            b = b.replace("'", "\\'")
            sql += "('" + str(b) + "','" + str(a) + "','" + str(int(time())) + "'), "
        sql = sql[:-2]
        result = db.write(sql, cursor, conn)

def _task_create_kwargs(arguments):
    result = AttrDict(
        name=arguments['<name>'],
        state='pending',
        command=arguments['--command'],
        condition=arguments['--condition'],
        last=str(datetime.now()),
    )
    if arguments['--once']:
        result.last += '<once>'
    if arguments['--inherit']:
        parent = db.read(table='tasks', name=arguments['--inherit'])
        result.update(dict(
            parent=parent.name,
            schedule='<inherit>',
            command=result.command if result.command else '<inherit>',
            email='<inherit>',
            condition=result.condition if result.condition else '<inherit>',
        ))
    result.update(_task_sched_kwargs(arguments))
    return result

def get_thread(thread_id, messages=False):
    query = "SELECT * FROM message_threads WHERE thread_id = %s LIMIT 1" % thread_id
    result = db.read_one(query)
    if messages:
        query = "SELECT * FROM messages WHERE thread_id = %s ORDER BY t_create DESC" % thread_id
        result['messages'] = db.read(query)
    return result

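`get_thread` interpolates `thread_id` into the query with `%`, which breaks (or worse) on non-numeric input. A placeholder-based sketch of the same lookup, assuming a DB-API connection (sqlite3 here; plain cursor calls stand in for the original `db.read_one`/`db.read` helpers, whose signatures are not shown in the source):

```python
import sqlite3

def get_thread_safe(conn, thread_id, messages=False):
    # the driver binds thread_id itself, so no string formatting is needed
    cur = conn.execute(
        "SELECT * FROM message_threads WHERE thread_id = ? LIMIT 1", (thread_id,))
    result = dict(zip([c[0] for c in cur.description], cur.fetchone()))
    if messages:
        cur = conn.execute(
            "SELECT * FROM messages WHERE thread_id = ? ORDER BY t_create DESC",
            (thread_id,))
        result['messages'] = cur.fetchall()
    return result

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE message_threads (thread_id INTEGER, title TEXT)")
conn.execute("CREATE TABLE messages (thread_id INTEGER, body TEXT, t_create INTEGER)")
conn.execute("INSERT INTO message_threads VALUES (1, 'hello')")
conn.execute("INSERT INTO messages VALUES (1, 'first', 10), (1, 'second', 20)")
thread = get_thread_safe(conn, 1, messages=True)
```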
def update_user(email):
    global current_user
    try:
        user = db.read(email)
        if user:
            current_user = user
    except:
        print("something went wrong")

def check(self):
    '''Checks done during submit phase. Protected elements are written.'''
    # check that session is FastSim or FullSim
    if self.soft_version == '' and self.session == '':
        self.__sbcurrent()
    allowed_session = ['FastSim', 'FullSim']
    if self.session not in allowed_session:
        raise ApplicationConfigurationError(
            None, 'session must be %s' % allowed_session)
    if self.number_of_subjobs < 1 or self.number_of_subjobs > 250:
        raise ApplicationConfigurationError(
            None, 'number_of_subjobs must be between 1 and 250')
    sql = '''SELECT DISTINCT soft_version
             FROM session_site_soft
             WHERE session_name = %s
             ORDER BY soft_version'''
    supported_soft_version = db.read(sql, (self.session, ))
    # convert from list of dict to list of strings
    supported_soft_version = [s['soft_version'] for s in supported_soft_version]
    if self.soft_version not in supported_soft_version:
        raise ApplicationConfigurationError(
            None, 'supported soft_version are: %s' % supported_soft_version)
    if self.session == 'FastSim' and self.background_frame == True:
        results = db.read('''SELECT prod_series, lfn_dir
                             FROM background_frame
                             WHERE valid = true
                             ORDER BY validation_timestamp DESC
                             LIMIT 1''')
        self.input_path = list()
        self.input_path.append(results[0]['lfn_dir'])
        self.input_mode = 'dir'
        self.background_frame_prod_series = results[0]['prod_series']
        logger.info('Last approved (%s) background frame has been setup '
                    'for job input' % self.background_frame_prod_series)

def getParent(dataset):
    if 'parent' in dataset['parameters']:
        sql = 'SELECT * FROM dataset_union WHERE dataset_id = %s'
        parent = db.read(sql, (dataset['parameters']['parent'], ))
        assert len(parent) == 1, 'Must be only one parent.'
        getParent(parent[0])
        print('')
        self.printDatasetDetail(parent[0])

def get_details(self):
    sql = ("""
        SELECT r.id, r.name
        FROM recipient AS r
        JOIN event AS e
        WHERE r.event_id = e.id AND e.id = %s
    """)
    data = db.read(sql, params=(self.event_id, ))
    self.recipients = data

def read():
    try:
        x = db.read()
        print("\n All data from collection \n")
        for emp in x:
            print(emp)
    except Exception as e:
        print(str(e))

def get_uid(self):
    '''
    | Get uid from database using his erdos handle and also set cfs_handle if provided
    '''
    sql = "SELECT uid, cfs_handle FROM user WHERE erd_handle = \'" + self.erd_handle + "\'"
    res = db.read(sql, self.cursor)
    if not res:
        sql = "INSERT INTO user (erd_handle, cfs_handle, erd_score, cfs_score) VALUES ( \'" + self.erd_handle + "\', \'" + self.cfs_handle + "\', '0', '0')"
        db.write(sql, self.cursor, self.conn)
        sql = "SELECT uid FROM user WHERE erd_handle = \'" + self.erd_handle + "\'"
        uid = db.read(sql, self.cursor)
        self.fill_activity(uid[0][0])
        return uid[0][0]
    else:
        if self.cfs_handle != res[0][1]:
            # bug fix: the original UPDATE had no WHERE clause and would
            # overwrite cfs_handle for every row in the table
            sql = "UPDATE user SET cfs_handle = \'" + self.cfs_handle + "\' WHERE erd_handle = \'" + self.erd_handle + "\'"
            db.write(sql, self.cursor, self.conn)
        return res[0][0]

def getDefaultRunSite(self):
    '''This method is called only if backend is LCG.
    It retrieves the sites associated to dataset_id.'''
    sql = 'SELECT site FROM dataset_site_union WHERE dataset_id = %s'
    sites = db.read(sql, (r'\x' + self.dataset_id, ))
    for site in sites:
        self.run_site.append(site['site'])

def getDataset(self, **kwargs):
    '''Get all metadata of all datasets. Public method, not exported to GPI.'''
    db_view_column = ['dataset_id', 'creation_date', 'occupancy']
    sql = 'SELECT * FROM dataset_union WHERE true'
    kwargs['owner'] = kwargs.get('owner', ['official', utils.getOwner()])
    # add filter to query
    if len(kwargs) > 0:
        for key, value in kwargs.iteritems():
            if key in db_view_column:
                sql += " AND %s ILIKE '%s%%'" % (key, value)
            elif key == 'files':
                sql += " AND files > %s" % value
            elif key in ['status', 'session', 'owner']:
                if not isinstance(value, list):
                    value = [value]
                sql += " AND (false"
                for s in value:
                    sql += " OR %s ILIKE '%s%%'" % (key, s)
                sql += ")"
            else:
                sql += " AND parameters->'%s' ILIKE '%s%%'" % (key, value)
    # clean up the query
    sql = sql.replace('false OR ', '')
    sql = sql.replace('true AND ', '')
    # TODO: add control to prevent sql injection
    datasets = db.read(sql)
    if len(datasets) == 0:
        raise GangaException('No dataset found')
    i = 0
    for dataset in datasets:
        dataset['id'] = i
        i += 1
        dataset['occupancy_human'] = utils.sizeof_fmt_binary(dataset['occupancy'])
        if 'evt_file' in dataset['parameters'] and not 'evt_tot' in dataset['parameters']:
            evt_file = int(dataset['parameters']['evt_file'])
            if dataset['files'] is None:
                dataset['files'] = 0
            files = int(dataset['files'])
            dataset['parameters']['evt_tot'] = evt_file * files
        if 'evt_tot' in dataset['parameters']:
            dataset['parameters']['evt_tot_human'] = utils.sizeof_fmt_decimal(
                int(dataset['parameters']['evt_tot']))
    return datasets

def get_table():
    items = db.read()
    shows = []
    for row in items:
        hosts = ', '.join([i for i in host_calc(row)])
        shows.append(Item(row[0], row[1], row[2], row[3], hosts))
    table = ItemTable(shows)
    return table

def _decendant_filter(task: Task, ancestor: str) -> bool:
    parent = task.parent
    while parent != "":
        if parent == ancestor:
            return True
        parent = db.read(table='tasks', name=parent).parent
    return False

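The walk above follows `.parent` links until it hits the root or the target ancestor. A self-contained sketch of the same traversal, with a plain dict standing in for the `db.read(table='tasks', ...)` lookups (`is_descendant` and the `parents` mapping are illustrative, not from the original code):

```python
def is_descendant(parents, name, ancestor):
    # parents maps task name -> parent name, with "" marking the root,
    # mirroring the .parent chain the snippet follows via db.read
    parent = parents[name]
    while parent != "":
        if parent == ancestor:
            return True
        parent = parents[parent]
    return False

# a three-level chain: root <- a <- b
parents = {"root": "", "a": "root", "b": "a"}
```

Each iteration costs one database read in the original, so deep task trees pay a round trip per level.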
def retrieve(event, context):
    log.info('GET to retrieve')
    key = event['pathParameters']['id']
    db_connection = db.get_db_connection()
    value = db.read(db_connection, key)
    if value is not None:
        return {'statusCode': 200, 'body': value}
    else:
        return {'statusCode': 404, 'body': {'error': 'Not found'}}

def _load_recipients_from_recipient_ids(self, recipients):
    sql = ("""
        SELECT r.peer_id
        FROM recipient r
        WHERE r.id IN %s
    """)
    params = [recipient['id'] for recipient in recipients]
    self.recipients = db.read(sql, params=(params, ))
    LOGGER.info('loaded recipients: {}'.format(self.recipients))

def update_tag_count():
    '''Function to update count of problems for each tag in the db'''
    sql = "SELECT tag FROM tag"
    a = db.read(sql, cursor)
    for i in a:
        tag = str(i[0].encode('utf8'))
        sql = "UPDATE tag SET count = (SELECT COUNT(*) FROM ptag WHERE tag = \'" + str(tag) + "\' ) WHERE tag = \'" + tag + "\'"
        # todo - optimise this sql
        result = db.write(sql, cursor, conn)

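The function above issues one UPDATE per tag, which its own TODO flags as worth optimising. One way to do it, sketched with sqlite3 standing in for the original MySQL connection: a single correlated UPDATE that refreshes every tag's count in one statement.

```python
import sqlite3

def update_tag_counts(conn):
    # one correlated subquery per row, one round trip total,
    # instead of one SELECT COUNT(*) + UPDATE pair per tag
    conn.execute(
        'UPDATE tag SET "count" = '
        '(SELECT COUNT(*) FROM ptag WHERE ptag.tag = tag.tag)')
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE tag (tag TEXT, "count" INTEGER DEFAULT 0)')
conn.execute("CREATE TABLE ptag (pid TEXT, tag TEXT)")
conn.execute("INSERT INTO tag (tag) VALUES ('dp'), ('graphs')")
conn.execute("INSERT INTO ptag VALUES ('p1','dp'), ('p2','dp'), ('p3','graphs')")
update_tag_counts(conn)
```

The same statement is valid MySQL, modulo the identifier quoting (backticks instead of double quotes).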
def get_privilege_role():
    name = request.form['Role']
    data = read("SELECT * FROM [Privilege] WHERE PID IN "
                "(SELECT PID FROM [Access] WHERE RID IN "
                "(SELECT RID FROM [Role] WHERE RName='" + str(name) + "'))")
    if data:
        return jsonify({'Records': data, 'Status_Code': 200})
    return jsonify({'Status_Code': 500, 'Message': 'Internal Error.'})

def init():
    global SETTINGS_MUSIC
    global SETTINGS_SOUND
    global SETTINGS_DIFFICULTY
    # Loads settings
    setting_music = db.read('setting', 'music')
    setting_sound = db.read('setting', 'sound')
    setting_difficulty = db.read('setting', 'difficulty')
    # Apply settings
    if setting_music == False:
        SETTINGS_MUSIC = False
    if setting_sound == False:
        SETTINGS_SOUND = False
    if type(setting_difficulty) is int:
        SETTINGS_DIFFICULTY = setting_difficulty
    return

def _load_recipients_list_from_db(self):
    """Load the recipients associated with the event id from db"""
    sql = ("""
        SELECT r.id, r.name, r.phone, r.peer_id
        FROM recipient r
        WHERE r.event_id = %s
    """)
    self.recipients = db.read(sql, params=(self.event_id, ))
    LOGGER.info('Loaded recipients from db: {}'.format(self.recipients))

def create_difficulty_matrix(self):
    self.difficulty_matrix = {}
    conn = db.connect()
    cursor = conn.cursor()
    sql = "SELECT * FROM activity"
    result = db.read(sql, cursor)
    for i in result:
        user = str(i[0].encode('utf8'))
        problem = str(i[1].encode('utf8'))
        if not self.difficulty_matrix.has_key(problem):
            self.difficulty_matrix[problem] = {}
        self.difficulty_matrix[problem][user] = i[4]

def __getLFNs(self):
    '''This method returns a list of LFNs from sbk database for the
    selected dataset_id. The number of LFNs is provided by user.'''
    sql = 'SELECT lfn, size FROM output_union WHERE dataset_id = %s'
    param = [r'\x' + self.dataset_id]
    if self.files_total > 0:
        sql += ' ORDER BY RANDOM() LIMIT %s'
        param.append(self.files_total)
    return db.read(sql, param)

def test_delete(self):
    event = {
        'pathParameters': {'id': 'ten'},
        'body': json.dumps({'data': 10})
    }
    context = {}
    create(event, context)
    del event['body']
    result = delete(event, context)
    self.assertIsNone(db.read(self.table, 'ten'))

def processbroadcastrequest(self, request):
    sql = """SELECT program_id
             FROM programs
             WHERE module = {0} AND code = {1}""".format(db.dbstr(self.name()), db.dbstr(request[MessageCodes.VALUE]))
    if db.read(sql, 0, 1):
        if self.program[ProgramCodes.REPEAT_PROGRAM] == 0:
            self.programStack.append(self.program)
        self.setprogram(request[MessageCodes.VALUE], request[MessageCodes.MESSAGE])
        self.loginformation("Program request",
                            "Program {0} requested by {1}.".format(request[MessageCodes.VALUE], request[MessageCodes.WORKER]))
    return True

def getprograms(self):
    sql = """SELECT program_id, code, name, 0 AS running
             FROM programs
             WHERE module = {0}
             ORDER BY default_program DESC, name""".format(db.dbstr(self.name()))
    programs = db.read(sql)
    for program in programs:
        if program['program_id'] == self.program['program_id']:
            program['running'] = 1
    return programs

def setdefaultprogram(self):
    sql = """SELECT program_id, code, relative_times, repeat_program
             FROM programs
             WHERE module = {0} AND (default_program = 1 OR selected = 1)
             ORDER BY selected DESC, default_program DESC""".format(db.dbstr(self.name()))
    programs = db.read(sql, 0, 1)
    self.program = {
        'program_id': programs[0]['program_id'],
        'code': programs[0]['code'],
        'relative_times': programs[0]['relative_times'],
        'repeat_program': programs[0]['repeat_program']
    }

def plot_difficulty_distribution_cfs(cfs_max_score):
    '''
    | Plot the distribution of number of problems vs difficulty level (max and min points for a competition) for codeforces
    | Input
    | - *cfs_max_score* : Defines the default max score for a competition on codeforces. For further information on this variable, check the problem module
    | - *app_name* : Database config being used
    | Conclusions from the plots when used with codeforces data:
    | - Not all the problems can be fetched by the codeforces API, so the initial plots had some outliers. Similar results were discovered using plots from *plot_points_distribution_cfs()*. Using the *cfs_max_score* variable cleared these outliers.
    '''
    a = time.time()
    sql = "SELECT P.pid, P.points, P.points/GREATEST(" + cfs_max_score + ", (SELECT MAX(points) FROM problem \
        WHERE contestId = P.contestId)) as difficulty FROM problem P \
        WHERE MID(P.pid,1,3) = \"cfs\" AND P.points>0"
    conn = db.connect()
    cursor = conn.cursor()
    result = db.read(sql, cursor)
    print "time to execute sql = ", time.time() - a
    difficulty = {}
    for i in result:
        pid = str(i[0].encode('utf8'))
        point = float(i[2])
        if point in difficulty:
            difficulty[point] += 1
        else:
            difficulty[point] = 1
    sorted_difficulty = sorted(difficulty.items(), key=operator.itemgetter(0), reverse=1)
    x = []
    y = []
    for i in sorted_difficulty:
        x.append(i[0])
        y.append(i[1])
    plt.figure()
    plt.plot(difficulty.keys(), difficulty.values(), 'ro')
    plt.plot(x, y)
    plt.show()
    cursor.close()
    conn.close()

def fetch_all_user_activity_cfs(handle=""):
    '''
    | Fetch User's activity from Codeforces
    | It is different from *fetch_user_activity_cfs()* as it logs each submission as a separate entry to plot the concept trail
    '''
    difficulty = 0
    payload = {}
    payload['handle'] = handle
    handle = 'cfs' + handle
    sql = "SELECT created_at FROM activity_concept WHERE handle = \'" + handle + "\' ORDER BY created_at DESC LIMIT 1;"
    res = db.read(sql, cursor)
    if res == ():
        last_activity = 0
    else:
        last_activity = int(res[0][0])
    r = requests.get(cfs_url, params=payload)
    if r.status_code != 200:
        print r.status_code, " returned from ", r.url
    else:
        result = r.json()['result']
        result.reverse()
        count = 1
        sql = "INSERT INTO activity_concept (handle, pid, attempt_count, status, difficulty, created_at) VALUES "
        for act in result:
            # take the min of the two values: in some cases the codeforces api
            # returns absurd results for relative time
            relative_time = min(7200, int(act['relativeTimeSeconds']))
            submission_time = int(act['creationTimeSeconds']) + relative_time
            if submission_time > last_activity:
                status = str(act['verdict'])
                if status == "OK":
                    status = "1"
                else:
                    status = "0"
                sql += "(\'" + handle + "\', \'cfs" + str(act['problem']['contestId']) + str(act['problem']['index']) + "\', '1', " + status + ", " + str(difficulty) + ", " + str(submission_time) + " ), "
                count += 1
                if count % 5000 == 0:
                    sql = sql[:-2]
                    db.write(sql, cursor, conn)
                    print count, " entries made in the database"
                    sql = "INSERT INTO activity_concept (handle, pid, attempt_count, status, difficulty, created_at) VALUES "
            else:
                break
        if sql[-2] == ",":
            sql = sql[:-2]
            db.write(sql, cursor, conn)

def fetch_activity():
    # erd_users = fetch_user_list_erd()
    sql = "SELECT uid, erd_handle FROM user"
    erd_users = db.read(sql, cursor)
    for i in erd_users:
        uid = str(i[0])
        handle = str(i[1].encode('utf8'))
        fetch_user_activity_erd(uid, handle)
    sql = "UPDATE user SET erd_score = \
        (SELECT SUM((correct_count-3)/attempt_count) FROM problem WHERE pid IN \
        (SELECT DISTINCT(pid) FROM activity WHERE uid = user.uid AND MID(pid,1,3)=\'erd\' AND status = 1)\
        AND correct_count>3)"
    db.write(sql, cursor, conn)

def downloadDataset(self, **kwargs):
    '''to retrieve all files belonging to a owned dataset from GRID to submission machine'''
    # TODO: create surl file lists beside the lfn list to permit lcg-cp
    # fail over chain implementation and to permit the direct plugin
    # subjob configuration by user given list
    kwargs['owner'] = utils.getOwner()
    kwargs['files'] = 0
    datasets = self.getDataset(**kwargs)
    dataset = self.printDatasets(datasets)
    dataset_id = dataset['dataset_id']
    files = dataset['files']
    occupancy_human = dataset['occupancy_human']
    home = os.path.expanduser('~')
    s = os.statvfs(home)
    free_disk = utils.sizeof_fmt_binary(s.f_bsize * s.f_bavail)
    # print('\nFree disk space: %s' % free_disk)
    print('\nTotal download size: %s\n' % occupancy_human)
    sql = 'SELECT lfn FROM analysis_output WHERE dataset_id = %s'
    lfns = db.read(sql, (r'\x' + dataset_id, ))
    localdir = os.path.join(home, dataset_id)
    os.mkdir(localdir)
    print('Downloading to %s ...' % localdir)
    i = 1
    for lfn in lfns:
        source = lfn['lfn']
        destination = os.path.join(localdir, source.split('/')[-1])
        process = subprocess.Popen(['lcg-cp', source, destination],
                                   stdout=subprocess.PIPE, close_fds=True)
        outData, errData = process.communicate()
        retCode = process.poll()
        if retCode != 0:
            raise Exception('lcg-cp fail with return code %d' % retCode)
        sys.stdout.write('\b' * 80 + '%s/%s' % (str(i), str(files)))
        sys.stdout.flush()
        i += 1
