def test_iteration(self):
    heap = NodeHeap(mknode(intid=0), 5)
    nodes = [mknode(intid=x) for x in range(10)]
    for index, node in enumerate(nodes):
        heap.push(node)
    for index, node in enumerate(heap):
        self.assertEqual(index, node.long_id)
        self.assertTrue(index < 5)
def test_iteration(self, mknode):  # pylint: disable=no-self-use
    heap = NodeHeap(mknode(intid=0), 5)
    nodes = [mknode(intid=x) for x in range(10)]
    for index, node in enumerate(nodes):
        heap.push(node)
    for index, node in enumerate(heap):
        assert index == node.long_id
        assert index < 5
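# Both iteration tests above build nodes through a mknode helper (a
# module-level function in the unittest-style test, a fixture argument in the
# pytest-style one). A minimal sketch of such a helper, assuming a Node
# constructor of the form Node(id, ip, port) and a long_id derived from the
# packed integer; the exact signature is an assumption, not shown above.
import random
from struct import pack

from kademlia.node import Node


def mknode(node_id=None, ip_addy=None, port=None, intid=None):
    """Build a Node for tests; with intid set, node.long_id equals intid."""
    if intid is not None:
        node_id = pack('>l', intid)  # 4 big-endian bytes, so long_id == intid
    if node_id is None:
        node_id = random.getrandbits(160).to_bytes(20, 'big')  # random 160-bit id
    return Node(node_id, ip_addy, port)
# In the pytest-style tests, the same factory would typically be handed to the
# test methods through a fixture that simply returns it.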
def test_max_size(self, mknode):  # pylint: disable=no-self-use
    node = NodeHeap(mknode(intid=0), 3)
    assert not node
    for digit in range(10):
        node.push(mknode(intid=digit))
    assert len(node) == 3
    assert len(list(node)) == 3
def test_maxSize(self):
    n = NodeHeap(mknode(intid=0), 3)
    self.assertEqual(0, len(n))
    for d in range(10):
        n.push(mknode(intid=d))
    self.assertEqual(3, len(n))
    self.assertEqual(3, len(list(n)))
def test_max_size(self):
    node = NodeHeap(mknode(intid=0), 3)
    self.assertEqual(0, len(node))
    for digit in range(10):
        node.push(mknode(intid=digit))
    self.assertEqual(3, len(node))
    self.assertEqual(3, len(list(node)))
def test_remove(self, mknode):  # pylint: disable=no-self-use
    heap = NodeHeap(mknode(intid=0), 5)
    nodes = [mknode(intid=x) for x in range(10)]
    for node in nodes:
        heap.push(node)
    heap.remove([nodes[0].id, nodes[1].id])
    assert len(list(heap)) == 5
    for index, node in enumerate(heap):
        assert index + 2 == node.long_id
        assert index < 5
def test_remove(self):
    heap = NodeHeap(mknode(intid=0), 5)
    nodes = [mknode(intid=x) for x in range(10)]
    for node in nodes:
        heap.push(node)
    heap.remove([nodes[0].id, nodes[1].id])
    self.assertEqual(len(list(heap)), 5)
    for index, node in enumerate(heap):
        self.assertEqual(index + 2, node.long_id)
        self.assertTrue(index < 5)
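# Taken together, the tests above pin down the NodeHeap contract: it keeps at
# most maxsize nodes, iterates them in order of XOR distance from a reference
# node, and remove() drops entries by id. A minimal sketch of a heap with that
# contract, assuming Node exposes id and a distance_to() method; this is an
# illustration of the interface under test, not the library's implementation.
import heapq
from operator import itemgetter


class NodeHeap:
    """Keep the maxsize nodes closest (by XOR distance) to a reference node."""

    def __init__(self, node, maxsize):
        self.node = node        # reference node that distances are measured from
        self.maxsize = maxsize
        self.heap = []          # list of (distance, node) tuples

    def push(self, nodes):
        # accept a single node or a list of nodes
        if not isinstance(nodes, list):
            nodes = [nodes]
        for node in nodes:
            if node not in self:
                heapq.heappush(self.heap, (self.node.distance_to(node), node))

    def remove(self, peer_ids):
        # rebuild the heap without the given ids
        peer_ids = set(peer_ids)
        self.heap = [(d, n) for d, n in self.heap if n.id not in peer_ids]
        heapq.heapify(self.heap)

    def __contains__(self, node):
        return any(n.id == node.id for _, n in self.heap)

    def __len__(self):
        return min(len(self.heap), self.maxsize)

    def __iter__(self):
        nearest = heapq.nsmallest(self.maxsize, self.heap)
        return iter(map(itemgetter(1), nearest))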
def __init__(self, protocol, http_client, node, chunk_key, peers, ksize, alpha,
             time_keeper=TimeKeeper()):
    TalosSpiderCrawl.__init__(self, protocol, node, chunk_key, peers, ksize, alpha)
    # keep track of the single nearest node without value - per
    # section 2.3 so we can set the key there if found
    self.nearestWithoutValue = NodeHeap(self.node, 1)
    self.http_client = http_client
    self.time_keeper = time_keeper
    self.is_first_round = True
def __init__(self, protocol, node, peers, ksize, alpha):
    """
    Create a new SpiderCrawl(er).

    Args:
        protocol: A :class:`~kademlia.protocol.KademliaProtocol` instance.
        node: A :class:`~kademlia.node.Node` representing the key we're
              looking for.
        peers: A list of :class:`~kademlia.node.Node` instances that
               provide the entry point for the network.
        ksize: The value for k based on the paper.
        alpha: The value for alpha based on the paper.
    """
    self.protocol = protocol
    self.ksize = ksize
    self.alpha = alpha
    self.node = node
    self.nearest = NodeHeap(self.node, self.ksize)
    self.lastIDsCrawled = []
    self.log = Logger(system=self)
    self.log.info("creating spider with peers: %s" % peers)
    self.nearest.push(peers)
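# For orientation, a crawl of this kind is typically seeded with the routing
# table's current neighbours of the target key, using the constants suggested
# by the Kademlia paper (k=20, alpha=3). A hedged usage sketch: digest and
# findNeighbors are the helper names used by older versions of the kademlia
# package, and `server` stands for a hypothetical running node; none of these
# names appear in the snippet above.
from kademlia.crawling import SpiderCrawl
from kademlia.node import Node
from kademlia.utils import digest


def make_spider(server, key, ksize=20, alpha=3):
    node_to_find = Node(digest(key))  # hash the key down to a 160-bit node id
    nearest_peers = server.protocol.router.findNeighbors(node_to_find)
    return SpiderCrawl(server.protocol, node_to_find, nearest_peers, ksize, alpha)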
def __init__(self, protocol, node, peers, ksize, alpha):
    SpiderCrawl.__init__(self, protocol, node, peers, ksize, alpha)
    # keep track of the single nearest node without value - per
    # section 2.3 so we can set the key there if found
    self.nearest_without_value = NodeHeap(self.node, 1)
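# The single-slot nearest_without_value heap set up above is consumed once the
# crawl actually finds the value: per section 2.3 of the paper, the value is
# cached at the closest node that did not report having it. A minimal sketch
# of that step, assuming the heap exposes popleft() and the protocol exposes a
# callStore RPC; the method name and both helpers are assumptions here.
from collections import Counter


def _handle_found_values(self, values):
    # take the most common returned value, then cache it at the nearest node
    # observed without the value (section 2.3 of the Kademlia paper)
    value = Counter(values).most_common(1)[0][0]
    peer = self.nearest_without_value.popleft()
    if peer is not None:
        self.protocol.callStore(peer, self.node.id, value)
    return value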