def main():
    humancount = int(input('Select number of customers to simulate: '))
    kioskcount = int(input('Select number of kiosks to simulate: '))
    bagcount = int(input('Select starting bag amount: '))
    dist = int(input('Select how many bags to distribute per store: '))

    cust = make_humans(humancount)
    for human in cust:
        human.info()

    kiosks = build_kiosks(kioskcount)
    start_bags = first_bags(bagcount)
    for bag in start_bags:
        bag.info()

    # Set up existing Publices
    pub_tpark = Store('Publix Town Park', 'E. Colonial', 2)
    pub_cpark = Store('Publix College Park', 'Edgewater', 1)
    pub_eola = Store('Publix Lake Eola', 'E Central', 3)
    pub_dphil = Store('Publix Dr. Phillips', 'Dr. Phillips', 4)
    publices = [pub_tpark, pub_cpark, pub_eola, pub_dphil]
    for publix in publices:
        publix.info()
        publix.inventory()

    print(f'\nSmartORBS distributed {dist} bags to each store.\n')

    # Distribute initial bag supply
    for publix in publices:
        deliver = random.sample(start_bags, dist)
        publix.receive(deliver)
        print(publix.inventory())
def run(self):
    # Wait for the peer to set up its receiving file.
    buf = self.file.read(Store()['MSS'])
    print("We now start sending the file ...")
    time.sleep(1)
    # Track the last chunk length to detect length % MSS == 0.
    last_len = 0
    while buf:
        # Try to send the buffer to the server.
        self.send_buf(buf)
        # Record the last chunk length.
        last_len = len(buf)
        # Expect the ack to increase by the bytes sent.
        self.ack += len(buf)
        # Read the next chunk.
        buf = self.file.read(Store()['MSS'])
    # Edge case: the file size is an exact multiple of MSS.
    if last_len == Store()['MSS']:
        # Send an empty buffer to tell the server the file has ended.
        self.send_buf(buf)
    print("The file is sent.")
def test_concept_iterator_includes_parent_concepts(self):
    parent = Store()
    child = Store(parent)
    parent.add_concept(Concept.word("p"))
    child.add_concept(Concept.word("c"))
    names = set(map(lambda c: c.name, child.concepts()))
    self.assertEqual(names, {"p", "c"})
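The test above exercises a Store whose concept iterator falls back to an optional parent store. A minimal sketch of that chaining, assuming the interface implied by the test (the `Concept` stand-in and the shadowing rule are assumptions; the real classes may differ):

```python
class Concept:
    """Minimal stand-in for the Concept used in the test above."""

    def __init__(self, name):
        self.name = name

    @staticmethod
    def word(name):
        return Concept(name)


class Store:
    """Concept store that chains lookups to an optional parent store."""

    def __init__(self, parent=None):
        self.parent = parent
        self._concepts = {}

    def add_concept(self, concept):
        self._concepts[concept.name] = concept

    def concepts(self):
        # Yield local concepts first, then any parent concepts
        # whose names are not shadowed by a local entry.
        yield from self._concepts.values()
        if self.parent is not None:
            for concept in self.parent.concepts():
                if concept.name not in self._concepts:
                    yield concept
```

With this sketch, iterating a child store visits both its own concepts and the parent's, which is exactly what the assertion on `{"p", "c"}` checks.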
def __init__(self, import_name, downstream_channels=None):
    super(FlaskGCRun, self).__init__(import_name)
    self.PROJECT_ID = os.getenv('GCP_PROJECT')
    if os.getenv('BUCKET_PIPELINE') is not None:
        self._store = Store(os.getenv('BUCKET_PIPELINE'))
    if os.getenv('BUCKET_OUTPUT') is not None:
        self._output_store = Store(os.getenv('BUCKET_OUTPUT'))
    # Avoid a mutable default argument: fall back to a fresh list.
    self.downstream_channels = downstream_channels or []
    self.init_app()
def __init__(self, tokens):
    self.g_arr = [Github(token, per_page=100) for token in tokens]
    self.seen_users = Store("/users")          # {ID: count}
    self.seen_repos = Store("/repos")          # {ID: int}
    self.contributors_cache = Store("/cache")  # ([userID], [score])
    self.token_num = 0
    self.last_users = collections.deque(
        [-1 for _ in range(MAX_SEEN_QUEUE)], MAX_SEEN_QUEUE)
    self.last_repos = collections.deque(
        [-1 for _ in range(MAX_SEEN_QUEUE)], MAX_SEEN_QUEUE)
def test_uses_concepts_in_parent(self):
    parent = Store()
    child = Store(parent)
    raw = Concept(
        None, Relation.Class,
        [Concept.word('c1'), Concept.word('c2')])
    in_parent = parent.integrate(raw)
    in_child = child.integrate(raw)
    self.assertEqual(id(in_parent), id(in_child))
def answer_ping(self, msg, addr):
    client_id = msg.header[1]
    print("A ping request message was received from Peer "
          + str(client_id) + ".")
    # Record that this peer is alive.
    Store()['controller'].add_pre(client_id)
    # Send a reply back to the pinging client.
    msg = Message()
    msg.setHeader(RECV_PING, Store()['my_id'])
    self.sock.sendto(msg.segment, addr)
def run(self):
    """Handle one incoming connection on this thread."""
    data = self.conn.recv(1024)
    msg = Message(data)
    # Dispatch on the message header.
    if msg.header[0] == INFO_FILE_REQ:
        # Callback only; no response is sent.
        Store()['controller'].handle_file_request(
            msg.header[1], bytes_to_int(msg.body))
    elif msg.header[0] == INFO_FILE_RES:
        # The peer is ready, so we can start receiving the file.
        Store()['controller'].handle_file_waiting(
            msg.header[1], bytes_to_int(msg.body))
    elif msg.header[0] == INFO_PEER_LOSS:
        # Fetch the successor which is not the lost peer.
        loss_peer = bytes_to_int(msg.body)
        suc = [Store()['controller'].get_suc(i) for i in range(2)]
        suc = [s for s in suc if s != loss_peer]
        # Wrap the reply announcing the new next peer.
        reply = Message()
        reply.setHeader(INFO_NEW_PEER, 0)
        reply.body = int_to_bytes(suc[0])
        self.conn.send(reply.segment)
    elif msg.header[0] == INFO_PEER_EXIT:
        # A peer is exiting gracefully.
        print("Peer gracefully exited: " + str(msg.header[1]))
        # Call back into the controller to update successors.
        Store()['controller'].handle_peer_departure(
            msg.header[1], bytes_to_int(msg.body))
        # Return the ack.
        reply = Message()
        reply.setHeader(INFO_EXIT_ACK, 0)
        self.conn.send(reply.segment)
    # Close the connection.
    self.conn.close()
def initialize(start_time=None, min_interval=None, warmup=False, logger=None):
    logger.info('initialize start_time={} min_interval={} warmup={}'.format(
        start_time, min_interval, warmup))
    global store
    store = Store(
        logger=logger,
        start_time=start_time,
    )
    warpup_bybit = StoreWarpupBybit(
        store=store,
        logger=logger,
        min_interval=min_interval,
    )
    if warmup:
        warpup_bybit.start()
    warpup_ftx = StoreWarpupFtx(
        store=store,
        logger=logger,
        min_interval=min_interval,
    )
    if warmup:
        warpup_ftx.start()
def main(self):
    """Set everything up, then invoke go()."""
    (options, args) = self.parser.parse_args()
    log_level = logging.ERROR
    if options.debug:
        log_level = logging.DEBUG
    elif options.verbose:
        log_level = logging.INFO
    logging.basicConfig(level=log_level)  # , format='%(message)s')
    if options.test:
        self.store = DummyStore(self.name, self.doc_type)
    else:
        # Load the config file for a real run.
        config = ConfigParser.ConfigParser()
        config.readfp(open(options.ini_file))
        auth_user = config.get("DEFAULT", 'user')
        auth_pass = config.get("DEFAULT", 'pass')
        server = config.get("DEFAULT", 'server')
        self.store = Store(self.name, self.doc_type, auth_user=auth_user,
                           auth_pass=auth_pass, server=server)
    self.go(options)
def test_sunday_burrito_con_carne_price(self):
    store = Store()
    store.state = Sunday()
    factory = store.create_factory("burrito")
    order = factory.order_dish()
    self.assertEqual(order.dish._price,
                     BurritoConCarne(Sunday.burrito_discount)._price)
def initialize(type):
    store = Store()
    store.dispatch("state", "setType", type)
    loop = asyncio.get_event_loop()
    # WIFI
    loop.create_task(scanCompleteHandler(store))
    loop.create_task(scanStartHandler(store))
    # STATE
    # MINIGAME
    loop.create_task(startMinigameHandler(store))
    loop.create_task(resetMinigameHandler(store))
    loop.create_task(drawMinigameTask(store))
    loop.create_task(resetMinigameLevelHandler(store))
    # LED
    loop.create_task(setBottomLayoutHandler(store))
    loop.create_task(setTopLayoutHandler(store))
    # BUTTON
    loop.create_task(buttonOneTask(store))
    loop.create_task(buttonTwoTask(store))
    loop.run_forever()
def test_three(self):
    store = Store("./test.json")
    key, value, ttl = "k", {"a": 1}, 5
    store.create(key, value, ttl)
    time.sleep(ttl + 1)
    with pytest.raises(KeyError):
        store.read(key)
def test_on_disconnected_store_cache_set_cache_get(self):
    self.store = Store(port=9999, connect_timeout=1, attempts=1)
    key = "uid:c20ad4d76fe97759aa27a0c99bff6710"
    self.store.cache_set(key, -1, 60)
    value = self.store.cache_get(key) or 0
    self.assertEqual(value, 0)
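The disconnected-store test above only passes if `cache_set`/`cache_get` swallow connection failures (a dead cache is just a miss) while a plain `get` propagates the error. A minimal sketch of that split, assuming a simple socket-based backend (the text protocol, method names beyond `cache_get`/`cache_set`, and constructor defaults are assumptions):

```python
import socket


class Store:
    """Key-value client where cache_* calls degrade gracefully on
    connection failure, while plain get() raises."""

    def __init__(self, host="localhost", port=11211,
                 connect_timeout=1, attempts=1):
        self.host = host
        self.port = port
        self.connect_timeout = connect_timeout
        self.attempts = attempts

    def _connect(self):
        last_err = None
        for _ in range(self.attempts):
            try:
                return socket.create_connection(
                    (self.host, self.port), timeout=self.connect_timeout)
            except OSError as err:
                last_err = err
        raise ConnectionError(f"store unreachable: {last_err}")

    def get(self, key):
        # Hard dependency: callers must handle connection errors.
        with self._connect() as conn:
            conn.sendall(f"get {key}\r\n".encode())
            return conn.recv(4096)

    def cache_get(self, key):
        # Soft dependency: an unreachable cache is just a miss.
        try:
            return self.get(key)
        except ConnectionError:
            return None

    def cache_set(self, key, value, ttl):
        # Best effort: losing a cache write is acceptable.
        try:
            with self._connect() as conn:
                conn.sendall(f"set {key} {ttl} {value}\r\n".encode())
        except ConnectionError:
            pass
```

The `value = store.cache_get(key) or 0` idiom in the test then maps a miss (None) to the default score of 0 without any error handling at the call site.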
def run_tests():
    """Test Catalogue class."""
    # Test empty Catalogue (defaults)
    print("Test empty Catalogue:")
    catalogue = Catalogue()
    assert not catalogue.stores  # an empty list is considered False

    # Test loading stores
    print("\nTest loading Stores:")
    catalogue.load_stores('store_saves.csv')
    for shop in catalogue.list_stores():
        print(f'\n{shop}')
        print("\nTest listing shop inventory")
        for item in shop.inventory.list_items():
            print(item)
    # Assuming the CSV file is non-empty, a non-empty list is considered True.
    assert catalogue.stores

    # Test adding a new Shop with values
    print("\nTest adding new Shop:")
    catalogue.add_store(Store("Squelons Wood", 300, 500, "I sell wood!"))
    for shop in catalogue.list_stores():
        print(shop)

    # Test saving Stores (check CSV file manually to see results)
    catalogue.save_stores('test.csv')
def setUp(self):
    unittest.TestCase.setUp(self)
    agentFactory = PoleAgentFactory()
    environmentFactory = PoleEnvironmentFactory()
    trainerFactory = PoleTrainerFactory()
    buildParameterFactory = PoleBuildParameterFactory()
    store = Store(self.dbPath)
    logger = MyLogger(console_print=True)
    self.builder = Builder(trainerFactory, agentFactory,
                           environmentFactory, store, logger)
    self.buildParameters = []
    for k1 in range(2):
        nIntervalSave = 3
        nEpoch = 5
        self.buildParameters.append(
            PoleBuildParameter(int(nIntervalSave), int(nEpoch),
                               label="test" + str(k1)))
    for agentClass in ("agent002", "agent003", "agent004"):
        self.buildParameters.append(
            PoleBuildParameter(int(nIntervalSave), int(nEpoch),
                               agentClass=agentClass,
                               label="test " + agentClass))
    self.loader = Loader(agentFactory, buildParameterFactory,
                         environmentFactory, store)
def test_two(self):
    store = Store("./test.json")
    key, value = "1", {"a": 1}
    store.create(key, value)
    with pytest.raises(KeyError):
        store.create(key, value)
    store.delete(key)
def test_on_disconnected_store_get_score(self, kwargs):
    self.store = Store(port=9999, connect_timeout=1, attempts=1)
    score = kwargs.pop('score')
    kwargs['birthday'] = api.DateField.str_to_date(kwargs['birthday'])
    self.assertAlmostEqual(scoring.get_score(self.store, **kwargs),
                           score, delta=0.1)
def test_four(self):
    store = Store("./test.json")
    key, value, ttl = "k", {"a": 1}, 5
    store.create(key, value, ttl)
    time.sleep(ttl - 1)
    assert json.loads(store.read(key)) == value
    store.delete(key)
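Taken together, these tests pin down a file-backed store: `create` rejects duplicate keys, `read` returns the value as a JSON string and raises `KeyError` once an optional TTL expires, and `delete` removes the key. A minimal sketch satisfying that contract (the on-disk layout and internal names are assumptions, not the tested implementation):

```python
import json
import time


class Store:
    """File-backed key-value store with per-key TTL, matching the
    create/read/delete contract exercised by the tests above."""

    def __init__(self, path):
        self.path = path
        try:
            with open(path) as f:
                self._data = json.load(f)
        except (FileNotFoundError, ValueError):
            self._data = {}

    def _flush(self):
        with open(self.path, "w") as f:
            json.dump(self._data, f)

    def create(self, key, value, ttl=None):
        # Duplicate keys are an error, as test_two expects.
        if key in self._data:
            raise KeyError(f"key already exists: {key}")
        expires = time.time() + ttl if ttl is not None else None
        self._data[key] = {"value": value, "expires": expires}
        self._flush()

    def read(self, key):
        entry = self._data.get(key)
        if entry is None:
            raise KeyError(key)
        # Expired entries are purged and reported as missing.
        if entry["expires"] is not None and time.time() > entry["expires"]:
            del self._data[key]
            self._flush()
            raise KeyError(f"key expired: {key}")
        return json.dumps(entry["value"])

    def delete(self, key):
        del self._data[key]
        self._flush()
```

Returning `json.dumps(...)` from `read` is what makes the `json.loads(store.read(key)) == value` round-trip in test_four work.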
def arrange_sim(pcid, cid, suffix=None):
    from store import Store
    store = Store(pcid=pcid, cid=cid)
    if suffix is not None:
        store.add_path_suffix(suffix)
    similarity = store.load("similarity")
    map_urls = store.load("map_urls")
    data = list()
    for key, sub_dict in similarity.items():
        line = list()
        for sub_key, val in sub_dict.items():
            line.append([sub_key, val])
        line.sort(key=lambda x: x[1], reverse=True)
        record = [key]
        for item in line:
            record.append(item[0])
            record.append(str(round(item[1], 2)) + "%")
        try:
            record.append(map_urls[key])
            data.append(record)
        except KeyError:
            print("map url KeyError", key)
    columns = [
        "brand-model",
        "1st sm", "1st share",
        "2nd sm", "2nd share",
        "3rd sm", "3rd share",
        "4th sm", "4th share",
        "5th sm", "5th share",
        "image_url",
    ]
    df = pd.DataFrame(data, columns=columns)
    df.to_csv(store.path.format(name="arrange_result").replace(".txt", ".csv"),
              encoding="utf-8")
def main():
    logging.basicConfig(format='%(asctime)s %(message)s',
                        datefmt='%m/%d/%Y %I:%M:%S %p')
    db = Store()
    reader = CsvReader()
    data = reader.get_data_from_csv('Monitoring report.csv')
    if data is None:
        return  # bail out if the CSV could not be read
    conn = db.create_connection(db_file='monitoring-report.db')
    if conn is None:
        return
    # Create the monitoring table.
    sql_create_table = (
        "CREATE TABLE IF NOT EXISTS monitoring("
        "date, energy, reactive_energy, power, maximeter, "
        "reactive_power, voltage, intensity, power_factor)")
    if db.create_table(conn, sql_create_table) is None:
        return
    values = []
    for index, row in data.iterrows():
        values.append(row)
    db.insert_data(conn, values)
    conn.close()
def test002(self):
    store = Store(self.dbPath)
    assert isinstance(store, Store)
    for k1 in range(2**3):
        buildParameter = BuildParameter(label="test" + str(k1))
        agent = Agent()
        for epoch in range(2**4):
            agentMemento = agent.createMemento()
            buildParameterMemento = buildParameter.createMemento()
            buildParameterKey = buildParameter.key
            buildParameterLabel = buildParameter.label
            storeField = StoreField(agentMemento, epoch,
                                    buildParameterMemento,
                                    buildParameterKey,
                                    buildParameterLabel)
            assert isinstance(storeField, StoreField)
            store.append(storeField)
    store.update_db()
    for storeField in store.restore("test%", 1):
        assert isinstance(storeField, StoreField)
def Main():
    config = readConfig()
    exchange = Exchange(config)
    pairs = config['exchange']['pair_list']
    exchange.ValidatePairs(pairs)
    store = Store()
    telegram = Telegram(config)
    while True:
        for pair in pairs:
            print(f"Get {pair} {timeframe}")
            df = exchange.fetchOHLCV(pair, timeframe, f"{size} hours ago")
            if not df.empty:
                df_old = store.load_data(pair, timeframe)
                df = calculate_gainer(df)
                if not df_old.empty:
                    # append returns a new frame; keep the result
                    df_old = df_old.append(df)
                else:
                    df_old = df
                store.store_data(pair=pair, timeframe=timeframe, data=df_old)
                print(f"Returned {pair} {df.size} lines")
            else:
                print(f"Returned {pair} Empty!")
def create_app():
    app = Flask(__name__)
    app.user = User()
    app.userList = UserList()
    app.usermap = UserLocationStore()
    app.friendStore = FriendStore()
    app.messageStore = MessageStore()
    app.userlocation = UserLocation()
    app.register_blueprint(site)
    app.register_blueprint(myMap)
    app.register_blueprint(register)
    app.register_blueprint(adminTable)
    app.register_blueprint(add)
    app.register_blueprint(add_doc)
    app.register_blueprint(image)
    app.register_blueprint(event)
    app.register_blueprint(notifications)
    app.store = Store()
    app.commentStore = CommentStore()
    app.requestStore = RequestStore()
    app.store_images = Store_Image()
    app.store_documents = Store_Document()
    app.register_blueprint(friends)
    app.register_blueprint(messages)
    app.register_blueprint(new)
    app.time = Timesql()
    app.init_db = init_db()
    app.savelocation = SaveLocation()
    app.notificationStore = NotificationStore()
    # app.store.add_event(Event('World War II', date='15/12/1942',
    #     place='Turkey', content='Donec sed odio dui. Etiam porta sem
    #     malesuada magna mollis euismod. Nullam id dolor id nibh ultricies
    #     vehicula ut id elit'))
    # app.store.add_event(Event('Train Accident', date='01/02/1985',
    #     place='California', content='Donec sed odio dui. Etiam porta sem
    #     malesuada magna mollis euismod. Nullam id dolor id nibh ultricies
    #     vehicula ut id elit'))
    return app
def dbsetup(readonly=False):
    'Set up our connection to Neo4j'
    ourstore = Store(neo4j.GraphDatabaseService(),
                     uniqueindexmap={}, classkeymap={})
    CMAinit(None, readonly=readonly, use_network=False)
    for classname in GraphNode.classmap:
        GraphNode.initclasstypeobj(ourstore, classname)
    return ourstore
def __init__(self, args):
    np.random.seed(int(time.time()))
    self.store = Store()
    self.visual = Visual(self.store)
    self.image_shape = [28, 28, 1]  # 28x28 pixels, black and white
    self.batch_size = args.batch_size
    self.lr = args.learning_rate
    self.train_epoch = args.train_epoch
    self.dropout_keep_probability = tf.placeholder("float")
    self.mnist = input_data.read_data_sets("MNIST_data/", one_hot=True,
                                           reshape=[])
    self.is_training = tf.placeholder(dtype=tf.bool)
    self.x = tf.placeholder(tf.float32, shape=(None, 64, 64, 1),
                            name="X_Input")
    self.z = tf.placeholder(tf.float32, shape=(None, 1, 1, 100), name="Z")
    self.G_z = define_generator(self.z, self.is_training)
    D_real, D_real_logits = define_discriminator(self.x, self.is_training)
    D_fake, D_fake_logits = define_discriminator(self.G_z, self.is_training,
                                                 reuse=True)
    D_loss_real = init_loss(D_real_logits, tf.ones, self.batch_size)
    D_loss_fake = init_loss(D_fake_logits, tf.zeros, self.batch_size)
    self.G_loss = init_loss(D_fake_logits, tf.ones, self.batch_size)
    self.D_loss = D_loss_real + D_loss_fake
    self.sess = None
def test_cache_get_error_connect(self):
    store = Store(
        MemcacheAdapter(address=os.environ['STORE_PORT_11211_TCP_ADDR'],
                        port=8000, ignore_exc=False))
    with self.assertRaises(Exception):
        store.cache_get('error_get_key')
async def update_store(ctx, *, details):
    store_details = details.split(':')
    if len(store_details) == 5:  # the correct number of arguments provided
        old_store_name = store_details[0].strip()
        new_store_name = store_details[1].strip()
        new_store_location_x = store_details[2].strip()
        new_store_location_z = store_details[3].strip()
        new_store_description = store_details[4].strip()
        catalogue = Catalogue()
        catalogue.load_stores(PATH_TO_DATA_FILE)
        # Attempt to remove the old store from the catalogue.
        is_removed = catalogue.remove_store(old_store_name)
        if not is_removed:
            embed = discord.Embed(
                title=f'Store: \'{old_store_name}\' cannot be found',
                color=13424046)
            await ctx.send(embed=embed)
        else:
            # Add the updated store.
            updated_store = Store(new_store_name, new_store_location_x,
                                  new_store_location_z, new_store_description)
            catalogue.add_store(updated_store)
            # Update the save file.
            catalogue.save_stores(PATH_TO_DATA_FILE)
            embed = discord.Embed(title='Store Updated Successfully',
                                  color=13424046)
            await ctx.send(embed=embed)
            await stores(ctx)
    else:
        embed = discord.Embed(title='Error: Please use the following format',
                              color=13424046)
        await ctx.send(embed=embed)
        await help(ctx, 'update_store')
def run_test():
    store = Store()
    store.server_ip = '10.233.93.82'
    store.id = '12'
    store.name = 'test'
    fetcher = FetcherTask(store=store, total=1, num=1)
    fetcher()
def find_stores(name):
    stores = []
    db = get_connection()
    cursor = db.cursor()
    if name:
        # Use a parameterized query rather than string formatting
        # to avoid SQL injection.
        sql = "SELECT * FROM store WHERE store_name LIKE %s"
        params = ('%' + name + '%',)
    else:
        sql = "SELECT * FROM store"
        params = ()
    try:
        cursor.execute(sql, params)
        results = cursor.fetchall()
        for row in results:
            store = Store()
            store.store_id = row[0]
            store.store_name = row[1]
            store.store_address = row[2]
            store.store_phone = row[3]
            store.store_city = row[4]
            store.store_state = row[5]
            stores.append(store)
    except Exception:
        info = sys.exc_info()
        print(info[0], ":", info[1])
        print("Error: unable to fetch data")
    db.close()
    return stores