def test1(self):
    coll1 = Collector()
    coll2 = Collector()
    n = 40
    s1 = 0
    s2 = 0
    q1 = 0
    q2 = 0
    for i in range(n):
        x1 = uniform(5, 15)
        x2 = uniform(20, 30)
        coll1.add(x1)
        coll2.add(x2)
        s1 += x1
        s2 += x2
        q1 += x1 * x1
        q2 += x2 * x2
    self.assertAlmostEqual(q1, coll1.sum_squares(), 5)
    self.assertAlmostEqual(q2, coll2.sum_squares(), 5)
    self.assertAlmostEqual(s1, coll1.sum(), 5)
    self.assertEqual(i + 1, coll1.count())
    self.assertAlmostEqual(s2, coll2.sum(), 5)
    self.assertEqual(i + 1, coll2.count())
    self.assertAlmostEqual(math.sqrt(q1 / n - s1 * s1 / n / n), coll1.standard_deviation(), 5)
    self.assertAlmostEqual(math.sqrt(q2 / n - s2 * s2 / n / n), coll2.standard_deviation(), 5)
    self.assertAlmostEqual(q1 / n - s1 * s1 / n / n, coll1.variance(), 5)
    self.assertAlmostEqual(q2 / n - s2 * s2 / n / n, coll2.variance(), 5)
    self.assertAlmostEqual(s1 / n, coll1.average(), 5)
    self.assertAlmostEqual(s2 / n, coll2.average(), 5)
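The expected values above rely on the population-variance identity Var(x) = E[x²] − (E[x])², computed from the running sum and sum of squares. A minimal standalone sketch of that reference computation, with no Collector dependency:

```python
import math
import random

def naive_stats(xs):
    # Population mean/variance/stddev via the sum-of-squares identity
    # Var(x) = E[x^2] - (E[x])^2 that the test checks Collector against.
    n = len(xs)
    s = sum(xs)
    q = sum(x * x for x in xs)
    var = q / n - (s / n) ** 2
    return s / n, var, math.sqrt(var)

xs = [random.uniform(5, 15) for _ in range(40)]
mean, var, std = naive_stats(xs)
```

Note this one-pass formula can lose precision when the mean is large relative to the spread; for the value ranges used in the test (5–15 and 20–30) it is accurate to well beyond the 5 decimal places asserted.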
def RUN(env='Tennis-ramNoFrameskip-v4', client_mode=False, collector_ip=None, size=15,
        n_server=4, n_send=1, epsilon=0.0, checkpoint_dir='checkpoint/', problem='MOP3',
        start_from=None, max_gen=1e10, gpu=False, tunnel=False):
    if start_from == 'latest':
        pass
    psw = '""'
    if tunnel:
        psw = getpass.getpass()
    # Parenthesized explicitly: the original `not client_mode and problem == 'ALL' or
    # problem == 'SPEC'` bound `and` tighter than `or`, so the batch branch also ran
    # in client mode whenever problem == 'SPEC'.
    if not client_mode and (problem == 'ALL' or problem == 'SPEC'):
        if start_from is not None:
            _, _, filenames = next(os.walk(start_from))
        array = ['SOP1', 'SOP2', 'MOP1', 'MOP2', 'MOP3'] if problem == 'ALL' else ['SOP3', 'MOP4']
        for p in array:
            target = None
            if start_from is not None:
                for f in filenames:
                    if p in f:
                        target = f
                        break
                if target is not None:
                    collector = Collector(env, size, n_server, n_send, epsilon, checkpoint_dir,
                                          p, start_from + target, client_mode, collector_ip,
                                          max_gen, gpu, psw)
                else:
                    collector = Collector(env, size, n_server, n_send, epsilon, checkpoint_dir,
                                          p, None, client_mode, collector_ip, max_gen, gpu, psw)
            else:
                collector = Collector(env, size, n_server, n_send, epsilon, checkpoint_dir,
                                      p, None, client_mode, collector_ip, max_gen, gpu, psw)
            collector.main_loop()
            time.sleep(1)
    else:
        collector = Collector(env, size, n_server, n_send, epsilon, checkpoint_dir, problem,
                              start_from, client_mode, collector_ip, max_gen, gpu, psw)
        collector.main_loop()
def collect_and_publish_summary(self, collect_for_debug=False):
    """
    Collect summary properties for all stages and publish them to namespace 'summary',
    with the data id set to {stage}.

    Args:
        collect_for_debug: if True, summaries for both the DEBUG and STAGE groups
            are published.
    """
    # Collect summaries for all stages and save to the local snapshot base after decoding.
    c = Collector(self.nacos_snapshot_repo_dir)
    with tempfile.TemporaryDirectory() as tmp_dir:
        c.generate_all_stages_summary(tmp_dir, collect_for_debug)
        c.encode_properties(tmp_dir, os.path.join(tmp_dir, "nacos.xml"))
    # Publish the summary property file to Nacos.
    threads = len(self.stage_to_namespace_ids.keys())
    if collect_for_debug:
        threads = threads + threads
    p = Pool(threads)
    for stage in self.stage_to_namespace_ids.keys():
        p.apply_async(self.publish_one_stage_summary, args=(stage, ))
        if collect_for_debug:
            p.apply_async(self.publish_one_stage_summary, args=(stage, True))
    p.close()
    p.join()
def __init__(self, cookie, maxnum=200, capacity=1024):
    self.num = maxnum
    self.cookie = cookie
    self.url = "https://www.pixiv.net/ajax/user/" + USER_ID + "/illusts"
    self.headers = BROWSER_HEADER
    self.collect_cnt = 0
    self.collector = Collector(cookie, capacity)
def urlMonitoring(environ, response):
    '''
    WSGI-based URL monitoring web service.

    NOTE: only the /metrics path is supported.

    :param environ: WSGI environment variables
    :param response: WSGI start_response callable
    :return: list: [utf-8 encoded Prometheus metrics]
    '''
    setup_testing_defaults(environ)
    if environ['PATH_INFO'] == '/metrics':
        logger.info('Start collecting metrics')
        res = []
        if CONFIG_PATH_ENV in environ:
            config = Config(environ[CONFIG_PATH_ENV])
            for url in config.getTargets():
                registry = CollectorRegistry()
                connector = HTTPConnector(url)
                try:
                    Collector(registry, connector).do()
                except Exception as e:
                    logger.error('Exception happened: ' + str(e))
                    status = '500 Internal Server Error'
                    response(status, [('Content-Type', 'text/plain')])
                    return []
                res.append(generate_latest(registry))
        status = '200 OK'
        headers = [('Content-type', 'text/plain; charset=utf-8')]
        logger.debug('Successfully collected ' + str(res))
        response(status, headers)
        return res
    else:
        response('404 Not Found', [('Content-Type', 'text/plain')])
        return [b'Not found']
def spawn_collectors(c):
    root_url = "http://%s:%s/%s" % (c['hostname'], c['port'], c['base_url'])
    pprint(c)
    for data_type_name in c['data_type']:
        data_type = c['data_type'][data_type_name]
        if not data_type['enable']:
            continue
        filter_dict = data_type.get('filter', None)
        for endpoint_name in data_type['endpoints']:
            endpoint = data_type['endpoints'][endpoint_name]
            url = root_url + endpoint['url']
            interval = endpoint['interval']
            prom_key = "%s:%s" % (data_type_name, endpoint_name)
            labels = endpoint['labels']
            values = endpoint['values']
            print(data_type_name, endpoint_name)
            pc = PrometheusClient(c['prometheus'], prom_key, labels, values)
            csv = CSVWriter(c['csvwriter'], prom_key, labels + values)
            if data_type_name == 'queue' and endpoint_name == 'config':
                worker = QueueConfigCollector(c['dpid'], url, interval, pc, csv, filter_dict)
                worker.start()
                continue
            worker = Collector(c['dpid'], url, interval, pc, csv, filter_dict)
            worker.start()
def __init__(self, agent: Agent, env: UnityEnvironment, config: Dict[str, Any]):
    super().__init__(agent, env, config)

    default_config = {
        "steps": 2048,
        # Tensorboard settings
        "tensorboard_name": None,  # str, set explicitly
        # PPO
        "ppo_config": {
            # GD settings
            "optimizer": "adam",
            "optimizer_kwargs": {
                "lr": 1e-4,
                "betas": (0.9, 0.999),
                "eps": 1e-7,
                "weight_decay": 0,
                "amsgrad": False
            },
            "gamma": .99,  # Discount factor
            # PPO settings
            "ppo_steps": 25,  # Max number of gradient updates in one iteration
            "eps": 0.1,  # PPO clip parameter
            "target_kl": 0.01,  # KL divergence limit
            "value_loss_coeff": 0.1,
            "entropy_coeff": 0.1,
            "max_grad_norm": 0.5,
            # Backpropagation settings
            "use_gpu": False,
        }
    }
    self.config = with_default_config(config, default_config)

    self.collector = Collector(agent=self.agent, env=self.env)
    self.ppo = PPOptimizer(agent=agent, config=self.config["ppo_config"])

    # Set up tensorboard
    self.writer: SummaryWriter
    if self.config["tensorboard_name"]:
        dt_string = datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
        self.path = Path.home() / "drlnd_logs" / f"{self.config['tensorboard_name']}_{dt_string}"
        self.writer = SummaryWriter(str(self.path))

        # Log the configs
        with open(str(self.path / "trainer_config.json"), "w") as f:
            json.dump(self.config, f)
        with open(str(self.path / "agent_config.json"), "w") as f:
            json.dump(self.agent.model.config, f)
        self.path = str(self.path)
    else:
        self.writer = None
def test_enhances_caller(self):
    assembly = """
00000098 <pbl_table_addr>:
     8e4:   f000 f824   bl  930 <app_log>

00000930 <app_log>:
$t():
"""
    c = Collector()
    self.assertEqual(2, c.parse_assembly_text(assembly))
    self.assertIn(0x00000098, c.symbols)
    self.assertIn(0x00000930, c.symbols)
    pbl_table_addr = c.symbols[0x00000098]
    app_log = c.symbols[0x00000930]
    self.assertNotIn("callers", pbl_table_addr)
    self.assertNotIn("callees", pbl_table_addr)
    self.assertNotIn("callers", app_log)
    self.assertNotIn("callees", app_log)
    c.enhance_call_tree()
    self.assertEqual(pbl_table_addr["callers"], [])
    self.assertEqual(pbl_table_addr["callees"], [app_log])
    self.assertEqual(app_log["callers"], [pbl_table_addr])
    self.assertEqual(app_log["callees"], [])
def CFagent(defaults):
    env = Game(**defaults)
    mover = Mover(env, _extra_dim=1, **defaults)
    teleporter = Teleporter(env, **defaults)
    buffer = ReplayBuffer(**defaults)
    CFagent = CFAgent(env, **defaults)
    CFbuffer = CFReplayBuffer(**defaults)
    collector = Collector(**defaults)
    with Save(env, collector, mover, teleporter, CFagent, **defaults) as save:
        intervention_idx, modified_board = teleporter.pre_process(env)
        dones = CFagent.pre_process(env)
        CF_dones, cfs = None, None
        for frame in loop(env, collector, save, teleporter):
            CFagent.counterfact(env, dones, teleporter, CF_dones, cfs)
            modified_board = teleporter.interveen(env.board, intervention_idx, modified_board)
            actions = mover(modified_board)
            observations, rewards, dones, info = env.step(actions)
            modified_board, modified_rewards, modified_dones, teleport_rewards, intervention_idx = teleporter.modify(observations, rewards, dones, info)
            buffer.teleporter_save_data(teleporter.boards, observations, teleporter.interventions, teleport_rewards, dones, intervention_idx)
            mover.learn(modified_board, actions, modified_rewards, modified_dones)
            board_before, board_after, intervention, tele_rewards, tele_dones = buffer.sample_data()
            teleporter.learn(board_after, intervention, tele_rewards, tele_dones, board_before)
            collector.collect([rewards, modified_rewards, teleport_rewards], [dones, modified_dones])
            CF_dones, cfs = CFagent.counterfact_check(dones, env, **defaults)
            CFbuffer.CF_save_data(CFagent.boards, observations, CFagent.counterfactuals, rewards, dones, CF_dones)
            CFboard, CFobs, cf, CFrewards, CFdones1 = CFbuffer.sample_data()
            CFagent.learn(CFobs, cf, CFrewards, CFdones1, CFboard)
def test_parses_assembly_and_stops_after_function(self):
    assembly = """
000034fc <window_raw_click_subscribe>:
$t():
    34fc:   b40f        push    {r0, r1, r2, r3}
    34fe:   4901        ldr r1, [pc, #4]    ; (3504 <window_raw_click_subscribe+0x8>)
    3500:   f7fc bdc2   b.w 88 <jump_to_pbl_function>
$d():
    3504:   000004c4    .word   0x000004c4
    3508:   00040000    .word   0x00040000
    350c:   000b008d    .word   0x000b008d

00003510 <.LC1>:
.LC1():
    3510:   69727073    .word   0x69727073
    3514:   42736574    .word   0x42736574
    3518:   31647269    .word   0x31647269
    351c:   0036        .short  0x0036
"""
    c = Collector()
    self.assertEqual(2, c.parse_assembly_text(assembly))
    self.assertIn(0x000034fc, c.symbols)
    self.assertEqual(c.symbols[0x000034fc]["name"], "window_raw_click_subscribe")
    # print("\n".join(c.symbols[0x000034fc]["asm"]))
    self.assertEqual(len(c.symbols[0x000034fc]["asm"]), 8)
def make_app():
    """Instantiates Flask app, attaches collector database, installs acl."""
    LOG.info('Starting API')
    app = flask.Flask(__name__)
    app.register_blueprint(v1.blueprint, url_prefix='/v1')
    collector = Collector()
    collector.clean()
    thread.start_new_thread(listen, (collector.add, ))

    @app.before_request
    def attach_config():
        flask.request.collector = collector
        collector.lock.acquire()

    @app.after_request
    def unlock(response):
        collector.lock.release()
        return response

    # Install the middleware wrapper
    if cfg.CONF.acl_enabled:
        acl.install(app, cfg.CONF)
    return app
def metateleport(defaults):
    collector = Collector(**defaults)
    env = Game(**defaults)
    mover = Mover(env, _extra_dim=1, **defaults)
    teleporter1 = Teleporter(env, _extra_dim=1, **defaults)
    teleporter2 = MetaTeleporter(env, **defaults)
    buffer1 = ReplayBuffer(**defaults)
    buffer2 = ReplayBuffer(**defaults)
    with Save(env, collector, mover, teleporter1, teleporter2, **defaults) as save:
        intervention_idx2, modified_board2 = teleporter2.pre_process(env)
        intervention_idx1, _ = teleporter1.pre_process(env)
        for frame in loop(env, collector, save, teleporter1, teleporter2):
            modified_board2 = teleporter2.interveen(env.board, intervention_idx2, modified_board2)
            modified_board1 = teleporter1.interveen(env.board, intervention_idx1, modified_board2)
            actions = mover(modified_board1)
            observations, rewards, dones, info = env.step(actions)
            modified_board1, modified_board2, modified_rewards1, modified_rewards2, modified_dones1, modified_dones2, tele_rewards, intervention_idx1, intervention_idx2 = teleporter2.metamodify(observations, rewards, dones, info, teleporter1.interventions)
            buffer1.teleporter_save_data(teleporter1.boards, modified_board2, teleporter1.interventions, modified_rewards2, modified_dones2, intervention_idx1)
            buffer2.teleporter_save_data(teleporter2.boards, observations, teleporter2.interventions, tele_rewards, dones, intervention_idx2)
            mover.learn(modified_board1, actions, modified_rewards1, modified_dones1)
            board_before, board_after, intervention, tel_rewards, tele_dones = buffer1.sample_data()
            teleporter1.learn(board_after, intervention, tel_rewards, tele_dones, board_before)
            board_before, board_after, intervention, tel_rewards, tele_dones = buffer2.sample_data()
            teleporter2.learn(board_after, intervention, tel_rewards, tele_dones, board_before)
            collector.collect([rewards, modified_rewards1, modified_rewards2, tele_rewards], [dones, modified_dones1, modified_dones2])
def run(self):
    print("==== Checking hosts on ====")
    # We could block until all 4 workers are online, but the system also works
    # with fewer than 4 workers; in this test only 1 was available.
    num_hosts = self._get_hosts_on(1)
    print("Hosts ON: %d" % num_hosts)
    print("==== Creating random matrices ====")
    self.matrix_a = self._get_random_matrix(self.a_n, self.a_m)
    self.matrix_b = self._get_random_matrix(self.b_n, self.b_m)
    print("==== Created matrices: ====")
    self._print_two_matrices(self.matrix_a, self.matrix_b)
    self.matrix_divide()
    distributor = Distributor("*", "50010")
    collector = Collector("*", "50012", 5000)
    distributor.send_jobs(self._create_jobs())
    # For the test, check services on the RasPis
    self._check_services(1)
    results = collector.collect(4)
    if 'err' in results:
        print("Error in some RasPi: %s" % results['err'])
        exit()
    print("==== Appending matrices ====")
    C1 = self._matrix_sum(results['A1B1'], results['A2B2'])
    C2 = self._matrix_sum(results['A3B1'], results['A4B2'])
    C = C1 + C2
    print("==== Final result: ====")
    self._print_matrix(C)
async def init(self):
    # Initialize the RabbitMQ message-queue connection pool
    try:
        await RabbitmqPool().init(
            addr=self._config.RabbitMQ.addr,
            port=self._config.RabbitMQ.port,
            username=self._config.RabbitMQ.username,
            password=self._config.RabbitMQ.password,
            vhost=self._config.RabbitMQ.vhost,
            max_size=self._config.RabbitMQ.max_pool_size)
    except Exception as ex:
        self._logger.error("RabbitMQ connection failed, %s. exit 1", ex)
        sys.exit(1)
    # Initialize the MongoDB connection pool
    MongoPool(
        host=self._config.MongoDB.addr,
        port=self._config.MongoDB.port,
        maxPoolSize=self._config.MongoDB.max_pool_size,
        minPoolSize=self._config.MongoDB.min_pool_size,
    )
    # Start the data-collection module
    Collector(warehouse=self._config.Collector.warehouse).start()
    # Start the HTTP API
    application = Application(url)
    addr = self._config.Base.addr
    port = self._config.Base.port
    application.listen(address=addr, port=port)
    self._logger.debug("Http api server start %s:%s", addr, port)
def test_collector_download_destination_path_does_not_exist():
    timestamp = datetime.datetime.now().strftime("%Y%m%d.%H")
    uta_label = "#" + "M.U1" + "." + timestamp + "#"
    filename = "ia.icama." + uta_label + ".zip"
    collector = Collector()
    collector.connect(host="192.168.179.130", username="******", password="******")
    collector.download(path="/home/fastfile/ewsd", filename=filename,
                       destination_path="c:\\Zdenek\\_tmp\\nonexistant\\")
def __init__(self, debug=False, collector=None):
    self.debug = debug
    # Avoid a mutable default argument: the original `collector=Collector()` was
    # evaluated once at definition time, so every instance built without an
    # explicit collector shared the same object.
    self.collector = collector if collector is not None else Collector()
    self.counter = 0
    self.index_to_handlers = defaultdict(PriorityQueue)  # index -> list of functions
    self.handler_to_trigger_info = {}  # function -> trigger_info
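The default-argument pattern here is worth spelling out: Python evaluates default values once, when `def` executes, not on each call. A minimal demonstration of the difference, using a stand-in class (names here are illustrative, not from the original code):

```python
class Collector:
    """Stand-in for the real Collector, used only to show object identity."""
    pass

def make_shared(collector=Collector()):  # default object created once, at def time
    return collector

def make_fresh(collector=None):          # sentinel pattern: a new object per call
    return collector if collector is not None else Collector()

a, b = make_shared(), make_shared()
c, d = make_fresh(), make_fresh()
```

Here `a is b` holds (both calls return the single default instance), while `c is not d` holds (each call builds its own), which is why the `None` sentinel is the idiomatic form.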
def test_collector_delete_insufficient_permissions_on_requested_file():
    timestamp = datetime.datetime.now().strftime("%Y%m%d.%H")
    uta_label = "#" + "M.U1" + "." + timestamp + "#"
    filename = "ia.icama." + uta_label + ".zip"
    collector = Collector()
    collector.connect(host="192.168.179.130", username="******", password="******")
    collector.delete(path="/home/fastfile/transfer", filename=filename)
def test_collector_download_invalid_request():
    timestamp = datetime.datetime.now().strftime("%Y%m%d.%H")
    uta_label = "#" + "M.U1" + "." + timestamp + "#"
    filename = "ia.icama." + uta_label + ".zip"
    collector = Collector()
    collector.connect(host="192.168.179.130", username="******", password="******")
    collector.download(path=None, filename=None, destination_path=None)
def main():
    '''
    Function which starts the tool.
    '''
    # Set up logging
    setupLogging()
    # Get start and end dates for collecting, if given in the config file
    startDate, endDate = _getStartAndEnd()
    # Get the list of datanodes we will be collecting measurements from
    iotTicket.getDataNodes()
    # Create FIWARE entities in Orion for the buses we collect data from, if not already created
    fiware.createEntities()
    if not fiware.sendToQl:
        # We will send measurements to QuantumLeap through Orion subscription(s),
        # so create them if not already created
        fiware.addSubscription()
    # Create the collector that takes care of the actual collection process
    myCollector = Collector(startDate, endDate)
    try:
        # and start collecting
        myCollector.startCollecting()
    except KeyboardInterrupt:
        log.info('Got keyboard interrupt. Collecting stopped.')
    except Exception:
        log.exception('Unexpected exception occurred.')
        exit()
    log.info('Data collection done.')
def test_collector_delete_source_path_does_not_exist():
    timestamp = datetime.datetime.now().strftime("%Y%m%d.%H")
    uta_label = "#" + "M.U1" + "." + timestamp + "#"
    filename = "ia.icama." + uta_label + ".zip"
    collector = Collector()
    collector.connect(host="192.168.179.130", username="******", password="******")
    collector.delete(path="/home/fastfile/nonexistant", filename=filename)
def __init__(self, segmentsCount, totalLength):
    assert segmentsCount > 0, \
        "TrainTrafficSystem invalid segmentsCount: {}".format(segmentsCount)
    assert totalLength > 0, \
        "TrainTrafficSystem invalid total length: {}".format(totalLength)
    self.segmentsCount = segmentsCount
    self.totalLength = totalLength
    self.L = float(totalLength) / float(segmentsCount)
    CoupledDEVS.__init__(self, "system")

    # Submodels
    generator = self.addSubModel(
        Generator(IATMin=1, IATMax=2, aMin=1, aMax=10, vMax=400 / 3.6))
    queue = self.addSubModel(Queue())
    segments = [
        self.addSubModel(RailwaySegment(self.L))
        for x in range(self.segmentsCount)
    ]
    self.collector = self.addSubModel(Collector())

    # Connections
    self.connectPorts(generator.outport, queue.inport)
    l1 = [queue] + segments
    l2 = segments + [self.collector]
    for current, nxt in zip(l1, l2):
        # Connect current to nxt
        self.connectPorts(current.qSend, nxt.qRecv)
        self.connectPorts(nxt.qSack, current.qRack)
        self.connectPorts(current.trainOut, nxt.trainIn)
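The wiring loop uses a compact pattern: `zip([queue] + segments, segments + [collector])` yields each stage paired with its successor along the chain queue → segment₀ → … → collector. A minimal sketch of that successor-pairing, isolated from the DEVS machinery:

```python
def chain_links(stages):
    # Pair each stage with its successor. zip(stages, stages[1:]) is equivalent
    # to zip([first] + middle, middle + [last]) used in the constructor above.
    return list(zip(stages[:-1], stages[1:]))

links = chain_links(["queue", "seg0", "seg1", "collector"])
```

Each resulting pair gets three port connections (forward queue, backward ack, train transfer), so the whole pipeline is wired in one pass regardless of `segmentsCount`.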
def __init__(self, artist_id, cookie, capacity=1024):
    self.url = 'https://www.pixiv.net/ajax/user/' + artist_id + '/profile/all?lang=zh'
    self.ref = 'https://www.pixiv.net/users/' + artist_id + '/illustrations'
    self.headers = {'x-user-id': USER_ID}
    self.headers.update({'Referer': self.ref})
    self.cookie = cookie
    self.collector = Collector(cookie, capacity)
def test_enhance_sibling_symbols(self):
    c = Collector()
    aeabi_drsub = {
        collector.ADDRESS: "0000009c",
        collector.SIZE: 8,
        collector.TYPE: collector.TYPE_FUNCTION,
    }
    aeabi_dsub = {
        collector.ADDRESS: "000000a4",
        collector.SIZE: 4,
        collector.TYPE: collector.TYPE_FUNCTION,
    }
    adddf3 = {
        collector.ADDRESS: "000000a8",
        collector.SIZE: 123,
        collector.TYPE: collector.TYPE_FUNCTION,
    }
    c.symbols = {int(f[collector.ADDRESS], 16): f
                 for f in [aeabi_drsub, aeabi_dsub, adddf3]}
    c.enhance_sibling_symbols()
    self.assertNotIn(collector.PREV_FUNCTION, aeabi_drsub)
    self.assertEqual(aeabi_dsub, aeabi_drsub.get(collector.NEXT_FUNCTION))
    self.assertEqual(aeabi_drsub, aeabi_dsub.get(collector.PREV_FUNCTION))
    self.assertEqual(adddf3, aeabi_dsub.get(collector.NEXT_FUNCTION))
    self.assertEqual(aeabi_dsub, adddf3.get(collector.PREV_FUNCTION))
    self.assertNotIn(collector.NEXT_FUNCTION, adddf3)
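The assertions above imply that `enhance_sibling_symbols` sorts function symbols by address and links each one to its neighbors, with the first symbol having no predecessor and the last no successor. A hedged sketch of that linking logic (the key names and the standalone function are assumptions for illustration, not the library's actual implementation):

```python
PREV_FUNCTION = "prev_function"
NEXT_FUNCTION = "next_function"

def link_siblings(symbols):
    # symbols: {address (int) -> symbol dict}; link address-adjacent functions.
    funcs = [symbols[addr] for addr in sorted(symbols)]
    for prev, nxt in zip(funcs, funcs[1:]):
        prev[NEXT_FUNCTION] = nxt
        nxt[PREV_FUNCTION] = prev

syms = {0x9c: {"name": "a"}, 0xa4: {"name": "b"}, 0xa8: {"name": "c"}}
link_siblings(syms)
```

After the pass, the middle symbol carries both links while the boundary symbols each lack one, which matches the `assertNotIn` checks in the test.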
def parse_options(g_args):
    g_args = g_args or qdict()
    args = qdict()
    # collector
    args.collector = Collector()
    # timecore
    args.timecore = g_args.timecore or TimeCore()
    # padding
    if g_args.padding == "history":
        args.padding = prefer.HistoryPadding()
    elif g_args.padding == "future":
        args.padding = prefer.FuturePadding()
    else:
        args.padding = prefer.RecentPadding()
    # fulltime
    args.fulltime = g_args.fulltime if g_args.fulltime is not None else False
    # infinity
    args.infinity = prefer.Infinity(
        g_args.infinity if g_args.infinity is not None else (0, 0, 0, 0, 0, 0))
    # ambiguous-direct
    # Read from g_args: the original tested `args.ambiguous_direct`, which is
    # never set on `args` before this point, so "future" could never match.
    args.ambiguous_direct = prefer.AmbiguousDirect(
        1 if g_args.ambiguous_direct == "future" else -1)
    return args
def __init__(self, classifier, mic_params, is_audio_record=False, root_path='./'):
    # arguments
    self.classifier = classifier
    self.mic_params = mic_params
    self.is_audio_record = is_audio_record
    self.root_path = root_path

    # plot path
    self.plot_path = self.root_path + self.mic_params['plot_path']

    # create folder for plot path
    create_folder([self.plot_path])

    # shortcuts
    self.feature_params = classifier.feature_params

    # feature extractor
    self.feature_extractor = FeatureExtractor(self.feature_params)

    # windowing params
    self.N, self.hop = self.feature_extractor.N, self.feature_extractor.hop

    # queue
    self.q = queue.Queue()

    # collector
    self.collector = Collector(
        N=self.N,
        hop=self.hop,
        frame_size=self.feature_params['frame_size'],
        update_size=self.mic_params['update_size'],
        frames_post=self.mic_params['frames_post'],
        is_audio_record=self.is_audio_record)

    # device
    self.device = sd.default.device[0] if not self.mic_params['select_device'] \
        else self.mic_params['device']

    # determine downsample factor
    self.downsample = self.mic_params['fs_device'] // self.feature_params['fs']

    # get input devices
    self.input_dev_dict = self.extract_devices()

    # show devices
    print("\ndevice list: \n", sd.query_devices())
    print("\ninput devs: ", self.input_dev_dict.keys())

    # stream
    self.stream = None

    # change-device flag
    self.change_device_flag = False
def test_suffix(self):
    col = Collector()
    print(col.suffix("abc.TXT"))
    print(col.suffix("./test_suite.py"))
    print(col.suffix(
        r"C:\Users\Qun\PycharmProjects\PhotoCollector\PhotoCollecter\test\test_suite.py"))
def test_stack_usage_line2(self):
    line = "puncover.c:8:43:dynamic_stack2	16	dynamic"
    c = Collector()
    c.symbols = {"123": {
        "base_file": "puncover.c",
        "line": 8,
    }}
    self.assertTrue(c.parse_stack_usage_line(line))
def __init__(self, width, height):
    game_mouse.Game.__init__(self, "Flappy Bird", width, height, 50)
    self.font_height = 12
    self.font = pygame.font.SysFont("arial", self.font_height)
    self.mCollector = Collector(width, height)
    return
def main():
    c = Collector()
    socketio.start_background_task(c.collect)
    socketio.start_background_task(cleanup_database_loop)
    socketio.run(app, debug=False, host='0.0.0.0', port=config("general", "PORT"))
def __init__(self, file_name):
    self.map = []
    self.hero = None  # waiting for the first call of spawn(hero)
    self.hero_x = -1
    self.hero_y = -1
    self.collector = Collector()  # collects spawn cells, enemies and treasures
    self.level = levels.index(file_name) + 1
    self._read_file(file_name)